CSC/ECE 517 Fall 2019 - E1990. Integrate suggestion detection algorithm

Introduction

  • On Expertiza, students receive review comments for their work from their peers. This review mechanism provides the students a chance to correct/modify their work, based on the reviews they receive. It is expected that the reviewers identify problems and suggest solutions, so that the students can improve their projects.
  • The instructor is provided with metrics such as the average volume and the total volume of the review comments written by each student.
  • It is observed that students learn more by reviewing others' work than by working on their own project, since reviewing gives them perspective on alternative approaches to solving a problem.

Problem Statement

  • Reviewers can fill in review comments on others' work; however, they do not receive feedback on how effective their reviews are for the receiver. It would thus make sense to have a feedback mechanism in place that can identify whether a reviewer has identified problems and provided suggestions for a student's or team's project. In order to achieve this, we need to identify the suggestions in the review comments as a way of determining how useful a review would be. This would motivate reviewers to give better, more constructive reviews.
  • We would also want the instructor of the course to be able to see how many constructive reviews a reviewer provided, compared with the average number of constructive reviews provided by the other reviewers in the course. Reviews could then be graded on this basis.

Current Implementation

  • Currently, reviewers can only submit reviews for their peers; their comments are not evaluated instantly.
  • The instructor can view insights regarding the text comments entered by reviewers in terms of the volume of the text, but has no other metrics, such as the number of problems and suggestions identified by the reviewers.

Proposed Solutions

  • On the student review page, once a student saves his/her review, a pop-up can be shown with feedback: suggestion scores, problem scores, and tone analysis, among others.
  • As an instructor, one would be able to check each student's review scores and his/her performance as compared with the class. These scores, along with the metrics, will be shown on the review report page under the Metrics column.

Flowchart

Given below is the design flowchart of our proposed solution:

[Design flowchart image]

As a student, once the student finishes (partly or completely) writing his/her review and clicks the "Save" or "Submit" button, the web service API will be called and the review's text will be sent as JSON. The PeerLogic web service will send back its output, which will then be displayed as a pop-up to the student.
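
A rough sketch of how this call could look in response_controller.rb is given below. The endpoint URL, JSON payload shape, and helper name are assumptions for illustration; they are not the final PeerLogic API.

 require 'net/http'
 require 'json'
 require 'uri'

 # Hypothetical helper: POST the review text to the PeerLogic suggestion-detection
 # service and return the parsed scores. The URL and fields below are assumptions.
 def fetch_suggestion_scores(review_text)
   uri = URI.parse('https://peerlogic.csc.ncsu.edu/suggestions') # assumed endpoint
   request = Net::HTTP::Post.new(uri, 'Content-Type' => 'application/json')
   request.body = { text: review_text }.to_json
   response = Net::HTTP.start(uri.host, uri.port, use_ssl: true) { |http| http.request(request) }
   response.is_a?(Net::HTTPSuccess) ? JSON.parse(response.body) : nil
 rescue StandardError
   nil # fail softly so a service outage never blocks saving the review
 end

Since this request is made server-side, the browser never talks to the PeerLogic service directly, which is why CORS should not need to be enabled.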

All the scores for all the reviews conducted by a student will be stored, and the aggregate data will be displayed to the instructor whenever he/she views the Review Report for any assignment/project. This will be visible in the Metrics column of the review report and will be displayed student-wise, i.e., for each student participating in the assignment.
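
A minimal sketch of how this aggregate might be computed for the Metrics column follows; the ReviewScore model and its column names are hypothetical placeholders for wherever the per-review scores end up being stored.

 # Hypothetical aggregation for the Review Report's Metrics column.
 # Assumes each analyzed review is persisted as a ReviewScore row with
 # reviewer_id, suggestion_score, and problem_score columns (names illustrative).
 def aggregate_review_scores(reviewer_id)
   scores = ReviewScore.where(reviewer_id: reviewer_id)
   return nil if scores.empty?
   {
     reviews_analyzed:     scores.count,
     avg_suggestion_score: (scores.sum(:suggestion_score) / scores.count.to_f).round(2),
     avg_problem_score:    (scores.sum(:problem_score) / scores.count.to_f).round(2)
   }
 end

Returning nil when a student has no analyzed reviews lets the report view show a placeholder instead of an error (see the edge cases under the Test Plan).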

Anticipated Code Changes

Broadly speaking, the following changes will be made:

  • User: Student
 - Add API calls to the suggestion-detection algorithm in response_controller.rb
 - Add a pop-up to display the summarized analysis of all comments in the review
 - Ensure that CORS does not need to be enabled for the API call to work
 - Write unit tests for our method(s) in response_controller.rb
 - Write unit tests for our changes in the response.html.erb view
  • User: Instructor
 - Add a method call to reports_controller.rb to display the aggregate analysis of student comments/reviews under the Metrics column
 - Write unit tests for our method(s) in reports_controller.rb
 - Write unit tests for our changes in _review_report.html.erb

Test Plan

The scenarios given below are a basic overview of the tests we plan to write.

Automated Testing Using RSpec

We will test our project and the added functionality using RSpec. Automated tests are carried out to check:

  • Whether the API is called when the student clicks the "Save" button for a review.
   Steps:
   
   1. Login to student profile.
   2. Go to your open assignments and submit your work.
   3. Go to that particular assignment and request a review.
   4. Begin writing a review. 
   5. Click "Save" button to save the review.
   6. A pop-up should be displayed that shows the review comment analysis for the student.
  • Whether the API is called when the student clicks the "Submit Review" button.
   Steps:
   
   1. Login to student profile.
   2. Go to your open assignments and submit your work.
   3. Go to that particular assignment and request a review.
   4. Begin writing a review. 
   5. Click "Submit Review" button to submit the review.
   6. A pop-up should be displayed that shows the review comment analysis for the student.
  • Whether the pop-up displays the review comment analysis when the student clicks the "Save" button.
   Steps:
   
   1. (As a student) Click "Save" button to save your current review.
   2. A pop-up should be displayed that shows the review comment analysis for the student.
    3. The pop-up should display a bar graph and comments comparing the student's review to the average review comments for that assignment.
    4. The pop-up should display a bar graph and comments comparing the student's review to the total review comments for that assignment.
  • Whether the pop-up displays the review comment analysis when the student clicks the "Submit Review" button.
   Steps:
   
   1. (As a student) Click "Submit Review" button to submit your current review.
   2. A pop-up should be displayed that shows the review comment analysis for the student.
    3. The pop-up should display a bar graph and comments comparing the student's review to the average review comments for that assignment.
    4. The pop-up should display a bar graph and comments comparing the student's review to the total review comments for that assignment.
  • Whether the instructor is able to see each student's aggregate review analysis score when he/she views Review Reports.
   Steps:
   
   1. Login as an instructor.
   2. Navigate to all assignments list.
   3. Click on View Reports button and open Review Report for that assignment.
    4. The instructor should be able to see each student's review comment analysis for the submitted reviews under the Metrics column.

There can also be other test cases to check UI elements and control flow.
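
As an illustration, a controller spec for the first scenario might look like the sketch below. The action name, the params, and the stubbed fetch_suggestion_scores helper are assumptions for illustration only.

 require 'rails_helper'

 # Sketch of an RSpec controller test for the save-review path.
 describe ResponseController, type: :controller do
   it 'requests a suggestion analysis when a student saves a review' do
     # Stub the (hypothetical) web-service helper so the test needs no network access.
     allow(controller).to receive(:fetch_suggestion_scores).and_return('suggestions' => 0.8)
     put :update, params: { id: 1, review: { comments: 'Consider adding tests.' } }
     expect(controller).to have_received(:fetch_suggestion_scores)
   end
 end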

Certain edge cases to look out for:

  1. The instructor should not be shown nil/error values when a student has not submitted any reviews.
  2. The student should not be shown the pop-up on clicking the "back" button or after refreshing the page.
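
For the first edge case, a simple guard in the report logic could suffice; aggregate_review_scores here is the hypothetical helper sketched earlier.

 # Show a placeholder instead of nil/error values for students with no reviews.
 metrics = aggregate_review_scores(participant.user_id)
 suggestion_display = metrics ? metrics[:avg_suggestion_score] : 'N/A'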

Our Work

Team Information

Mentor
Ed Gehringer (efg@ncsu.edu)
Team Name
Pizzamas
Members
  • Maharshi Parekh (mgparekh)
  • Pururaj Dave (pdave2)
  • Roshani Narasimhan (rnarasi2)
  • Yash Thakkar (yrthakka)

References

  1. Expertiza: http://wiki.expertiza.ncsu.edu/index.php/Main_Page
  2. Pull request for a Metrics Legend change: https://github.com/expertiza/expertiza/pull/1493
  3. Expertiza GitHub repo: https://github.com/expertiza/expertiza