CSC/ECE 517 Fall 2019 - E1979. Completion/Progress view

From Expertiza_Wiki
Revision as of 21:37, 10 November 2019 by Jcai3 (talk | contribs)

Introduction

  • In Expertiza, peer reviews are used as a metric to evaluate someone’s project. Once someone has peer reviewed a project, the authors of the project can also provide feedback on that review, called “author feedback.” When grading peer reviews, it would be useful for instructors to take the author feedback into account, since it shows how helpful the peer review actually was to the author of the project.

Current Implementation

  • Currently, the instructor can see only limited information, such as the number of reviews done and the teams each student has reviewed, but little about author feedback. The current report view is shown below.


  • However, the instructor has no easy way of seeing the author-feedback scores, so it is far too much trouble to include them in grades for reviewing.
  • The aim of this project is therefore to build more feedback information into the reviewer report, so that the instructor of the course is able to grade reviewers based on author feedback and review data.

Problem Statement

  • We need to integrate the following review-performance information into the reviewer report:
  1. # of reviews completed
  2. Length of reviews
  3. [Summary of reviews]
  4. Whether reviewers added a file or link to their review
  5. The average ratings they received from the authors
  6. An interactive visualization or table that shows this data would be GREAT (we may use the “HighChart” JavaScript library to build it.)
  • After analyzing the current code, we found that the number of reviews and the summary of reviews already exist in the system, so we only need to finish the following tasks.
  1. Length of reviews (we can use HighChart to visualize it)
  2. Whether reviewers added a file or link to their review
  3. The average ratings they received from the authors
  • As described in our objectives, the average-ratings part of this project was completed last year, when a new column (author feedback) was added to the review report. However, that implementation still has some drawbacks, so we also need to improve several parts of the author-feedback feature.
  1. Use the existing code to calculate the average feedback score
  2. Make the author-feedback column toggleable.
  • Our plan and solutions for finishing this project are described below.

Project Design

The basic design of this project can be shown in the UML flow chart below.

View Improvement

  • We plan to change the "Review report" page (log in as an instructor, then go to Manage -> Assignments -> View review report) in three main ways.
  1. We plan to add one more toggleable column showing the ratings and average rating of the author feedback for a student's reviews of a particular assignment. The logic for calculating the average metareview score would be similar to the logic already implemented for the "Score Awarded/Average Score" column. Last year's project already implemented this column, so we only need to make it toggleable, since it currently reduces the width of the comment section used for grading. Below is the page we plan to edit.
  2. We plan to improve the review-length column. Currently it shows only raw numbers and an average. We will turn the review length into a visual chart using the “HighChart” JavaScript library, so that review lengths become easier for instructors to compare. The chart will look like the one below.
  3. We plan to add one more column indicating whether reviewers added a file or link to their review. It will be a boolean result, displayed like a checkbox: if the reviewer added a file or link to their review, a ☑️ icon will be shown in the column; otherwise ✖️ will be shown.
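As a rough illustration of the review-length data behind such a chart, the values could be computed as sketched below. The word-count metric and the `review_lengths` helper are assumptions for this sketch, not Expertiza's actual implementation, which may measure length differently (e.g. in characters).

```ruby
# Sketch: compute per-review lengths (here, word counts) that a
# HighChart column chart could display alongside the average.
def review_lengths(review_texts)
  review_texts.map { |text| text.split.size }
end

texts = [
  "Solid design overall",
  "Add more unit tests to the controller"
]
lengths = review_lengths(texts)            # => [3, 7]
average = lengths.sum.to_f / lengths.size  # => 5.0
```

The resulting `lengths` array would be handed to the chart as its data series, with `average` drawn as a reference line.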

Controller Improvement

The controller design is based on the data needed by the view. We may need to add some methods to retrieve that data, for example:

  1. average_score_feedback
  2. have_file_added
  3. have_link_added
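A minimal, self-contained sketch of what these helpers could look like is shown below. The `Review` struct and its fields are stand-ins for Expertiza's ActiveRecord models and are assumptions for illustration only.

```ruby
# Hypothetical stand-in for a reviewer's response records.
Review = Struct.new(:comments, :file_attached, :feedback_scores)

# Average of all author-feedback scores a reviewer received;
# nil when no feedback exists yet.
def average_score_feedback(reviews)
  scores = reviews.flat_map(&:feedback_scores)
  return nil if scores.empty?
  (scores.sum.to_f / scores.size).round(2)
end

# True if any of the reviewer's reviews has a file attached.
def have_file_added(reviews)
  reviews.any?(&:file_attached)
end

# True if any review comment contains a hyperlink.
def have_link_added(reviews)
  reviews.any? { |r| r.comments.match?(%r{https?://\S+}) }
end

reviews = [
  Review.new("Good work, see http://example.com", false, [90, 80]),
  Review.new("Needs more tests", true, [70])
]
average_score_feedback(reviews)  # => 80.0
have_file_added(reviews)         # => true
have_link_added(reviews)         # => true
```

In the actual controller these methods would query the reviewer's response and feedback records in the database rather than in-memory structs, but the aggregation logic would be similar.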

Code Changes

To do...

Test Plan

We will write unit and functional tests with RSpec, and run integration tests using Travis CI and the Coveralls bot.

Automated Testing Using RSpec

  1. We plan to test the response report page (/review_mapping/response_report?id={:assignmentID}) to make sure the new fields (feedback, file, and chart) exist.
  2. We will add an RSpec test to each affected controller to test our new methods, such as have_file_added and have_link_added.

Coverage

Coveralls coverage results will be added here after we finish all our work.

Manual UI Testing

  1. Log in as an instructor
  2. Click Manage and choose Assignments
  3. Choose review report and click view
  4. See whether new fields are shown.

Team Information

Mentor: Mohit Jain

  • Jing Cai ()
  • Yongjian Zhu ()
  • Weiran Fu ()
  • Dongyuan Wang ()

Related Links

  1. Forked GitHub Repository