CSC/ECE 517 Fall 2019 - E1990. Integrate suggestion detection algorithm

Introduction

  • On Expertiza, students receive review comments on their work from their peers. This review mechanism gives students a chance to correct or improve their work based on the reviews they receive. Reviewers are expected to identify problems and suggest solutions so that students can improve their projects.
  • The instructor is provided with metrics such as the average and total volume of the review text written by each student.

Current Implementation

  • Currently, reviewers cannot see how many problems and suggestions they have identified in their reviews; they can only submit reviews for their peers.
  • Instructors can view insights about the text comments entered by reviewers only in terms of text volume; they have no metrics on the number of problems and suggestions the reviewers have identified.

Problem Statement

  • Reviewers can fill in review comments on others' work; however, they receive no feedback on how effective their reviews are. It would thus make sense to have a feedback mechanism that can identify whether a reviewer has pointed out problems and provided suggestions for a student's or team's project. To achieve this, we need to detect the suggestions in review comments and use them to judge how useful a review is. This should motivate reviewers to give better, more constructive reviews.
  • We also want the instructor of the course to be able to see how many constructive reviews a reviewer provided, compared with the average number provided by the other reviewers in the course.

Proposed Solutions

  • On the student review page, once a student saves his/her review, a pop-up can be shown with feedback: suggestion scores, problem scores, and tone analysis, among others.
  • An instructor will be able to check each student's review scores and see how that student's performance compares with the rest of the class. These scores, along with the other metrics, will be shown on the review report page under the Metrics column.

Flowchart

Given below is the design flowchart of our proposed solution:

[Design flowchart image]

Once the student finishes (partly or completely) writing his/her review and clicks the "Save" or "Submit" button, the web service API will be called and the review's text will be sent as JSON. The PeerLogic web service will send back its output, which will then be displayed to the student as a pop-up.
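Below is a minimal sketch of what this server-side call could look like in response_controller.rb. The endpoint URL, the request/response JSON schema, and the helper name analyze_review_text are assumptions for illustration, not the confirmed PeerLogic contract:

```ruby
require 'json'
require 'net/http'

# Hedged sketch for response_controller.rb. The endpoint URL, request body,
# and response schema are assumptions about the PeerLogic service, not its
# confirmed contract.
SUGGESTION_API_URI = URI('http://peerlogic.csc.ncsu.edu/suggestion/detect') # assumed endpoint

def analyze_review_text(comments)
  # Send the review's text to the web service as JSON.
  request = Net::HTTP::Post.new(SUGGESTION_API_URI, 'Content-Type' => 'application/json')
  request.body = { reviews: comments.map { |text| { text: text } } }.to_json

  response = Net::HTTP.start(SUGGESTION_API_URI.host, SUGGESTION_API_URI.port) do |http|
    http.request(request)
  end

  # The parsed scores (suggestion, problem, tone, ...) are handed to the view,
  # which renders them in the pop-up shown to the student.
  JSON.parse(response.body)
rescue StandardError => e
  Rails.logger.error("Suggestion-detection call failed: #{e.message}")
  nil # the review itself is already saved; fail quietly
end
```

Making the request from the Rails server rather than from the browser also means the browser never contacts PeerLogic directly, which is why CORS does not need to be enabled (see the anticipated code changes below).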

All the scores for all the reviews conducted by a student will be stored, and the aggregate data will be displayed to the instructor whenever he/she views the Review Report for an assignment/project. It will be visible in the Metrics column of the review report and will be displayed student-wise, i.e., for each student participating in the assignment.
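One way to persist these scores is a small ActiveRecord model keyed by the review's response record. The ReviewAnalysis model and its columns below are hypothetical names used for illustration, not part of Expertiza's existing schema:

```ruby
# Hypothetical ActiveRecord model for persisting one analysis result per
# review; the table and column names are illustrative, not Expertiza's
# existing schema.
class ReviewAnalysis < ActiveRecord::Base
  belongs_to :response # the saved review this analysis belongs to

  # Store (or refresh) the scores returned by the web service.
  def self.record_scores(response, scores)
    analysis = find_or_initialize_by(response_id: response.id)
    analysis.update(
      suggestion_score: scores['suggestions'],
      problem_score:    scores['problems'],
      tone_score:       scores['tone']
    )
  end
end
```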

Anticipated Code Changes

Broadly speaking, the following changes will be made:

  • User: Student
 - Add API calls to the suggestion-detection algorithm in response_controller.rb (as sketched above)
 - Add a pop-up to display the summarized analysis of all comments in the review
 - Ensure that CORS does not need to be enabled for the API call to work (calling the web service server-side avoids this)
 - Write unit tests for our method(s) in response_controller.rb
 - Write unit tests for our changes in the response.html.erb view
  • User: Instructor
 - Add a method call to reports_controller.rb to display the aggregate analysis of student comments/reviews under the Metrics column (see the sketch after this list)
 - Write unit tests for our method(s) in reports_controller.rb
 - Write unit tests for our changes in _review_report.html.erb
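A hedged sketch of the instructor-side aggregation referenced above; the ReviewAnalysis model and the reviewer_id lookup carry over from the earlier sketch and are assumptions, not Expertiza's actual schema:

```ruby
# Hedged sketch for reports_controller.rb: aggregate the stored scores per
# reviewer so the Metrics column can display them.
def aggregate_review_analysis(reviewer)
  analyses = ReviewAnalysis.joins(:response)
                           .where(responses: { reviewer_id: reviewer.id })

  # Edge case: a student with no submitted reviews gets a placeholder,
  # never nil/error values.
  return 'N/A' if analyses.empty?

  {
    avg_suggestion_score: analyses.average(:suggestion_score).to_f.round(2),
    avg_problem_score:    analyses.average(:problem_score).to_f.round(2)
  }
end
```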

Test Plan

The scenarios below give a basic overview of the tests we plan to write.

Automated Testing Using Rspec

We will test our project and the added functionality using RSpec. Automated tests will check:

  1. Whether the APIs are called when the student clicks the "Save" button on a review.
  2. Whether the APIs are called when the student clicks the "Submit Review" button.
  3. Whether the pop-up displays the review-comment analysis when the student clicks the "Save" button.
  4. Whether the pop-up displays the review-comment analysis when the student clicks the "Submit Review" button.
  5. Whether the instructor can see each student's aggregate review-analysis score when he/she views the Review Report.

There can also be other test cases to check UI elements and control flow.

Certain edge cases to look out for:

  1. The instructor should not be shown nil/error values for a student who has not submitted any reviews.
  2. The student should not be shown the pop-up on clicking the "back" button or after refreshing the page.
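A minimal RSpec sketch of the first automated test and the first edge case; the routes, params, and helper names are assumptions carried over from the earlier sketches:

```ruby
require 'rails_helper'

# Hedged RSpec sketches; controller actions, params, and helper names are
# assumptions from the sketches above, not Expertiza's actual code.
describe ResponseController, type: :controller do
  it 'calls the suggestion-detection service when a review is saved' do
    # Stub the (assumed) helper so no real HTTP request is made.
    expect(controller).to receive(:analyze_review_text)
      .and_return('suggestions' => 0.8, 'problems' => 0.6, 'tone' => 0.7)
    post :create, params: { review: { comments: 'Consider adding tests.' } }
  end
end

describe ReportsController, type: :controller do
  it 'shows a placeholder instead of nil for a student with no reviews' do
    reviewer = double('Participant', id: 42)
    # With no stored analyses, the aggregate helper should return 'N/A'.
    expect(controller.send(:aggregate_review_analysis, reviewer)).to eq('N/A')
  end
end
```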

Our Work

Team Information

Mentor
Ed Gehringer (efg@ncsu.edu)
Team Name
Pizzamas
Members
  • Maharshi Parekh (mgparekh)
  • Pururaj Dave (pdave2)
  • Roshani Narasimhan (rnarasi2)
  • Yash Thakkar (yrthakka)
