CSC/ECE 517 Fall 2019 - E1979. Completion/Progress view

From Expertiza_Wiki

Introduction

  • In Expertiza, peer reviews are used as a metric to evaluate a project. Once someone has peer-reviewed a project, the authors of the project can also provide feedback on that review, called “author feedback.” When grading peer reviews, it would be useful for instructors to take author feedback into account, since it shows how helpful the peer review actually was to the authors of the project.

Current Implementation

  • Currently, the instructor can see only limited information in the review report: the number of reviews done, the teams each student has reviewed, and some author-feedback data. The current report view is shown below.

  • However, the instructor has no easy way of seeing the author-feedback scores, so it is far too much trouble to include them in the grades for reviewing.
  • The aim of this project is therefore to build more feedback information into the reviewer report, so that the instructor of the course can grade reviewers based on both author feedback and review data.

Problem Statement

  • We need to integrate the following measures of review performance:
  1. Number of reviews completed
  2. Length of reviews
  3. Summary of reviews
  4. Whether reviewers added a file or link to their review
  5. The average ratings they received from the authors
  6. An interactive visualization or table that shows this data (we may use the “HighChart” javascript library for it)
  • After analyzing the current code, we found that the number of reviews, the summary of reviews, and the visualization of review length already exist in the system, so we only need to finish the following tasks:
  1. Whether reviewers added a file or link to their review
  2. The average ratings they received from the authors
  • As described above, the average-ratings part of this project was done last year, when a new column (author feedback) was added to the review report. However, those functions still have some drawbacks, so we also need to improve several parts of the author-feedback feature:
  1. Fix the code that calculates the average feedback score
  2. Make the author-feedback column fit the existing view
  • Our plan and solutions for finishing this project are as follows.

File:Feedbackk task.png

Project Design

The basic design of this project is shown in the UML flow chart below.

View Improvement

  • We decided to change mainly the "Review report" page (log in as an instructor, then go to Manage -> Assignments -> View review report) in three ways.
  1. We plan to add one more column showing the average author-feedback rating for a student's review of a particular assignment. The logic for calculating the average score for the metareviews is similar to the already-implemented logic for the "Score Awarded/Average Score" column.
  2. We plan to improve the review-length column. Currently it shows only raw numbers and an average. We will turn the review lengths into a chart using the “HighChart” javascript library, so that review length becomes clearer for instructors. The chart will look like the one below.
  3. We plan to list, below each team name, all links attached to the reviews. If a review contains links, they will be shown underneath.
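As a rough illustration of point 2, the review lengths could be packaged into a Highcharts-style options hash on the server side and serialized to JSON for the chart library. This is a minimal sketch; the method name, series label, and category labels are hypothetical, not Expertiza code.

```ruby
require 'json'

# Build a Highcharts-style column-chart options hash from a list of
# per-review word counts. (Hypothetical helper for illustration only.)
def review_length_chart_options(review_lengths)
  {
    chart: { type: 'column' },
    title: { text: 'Review length (words)' },
    xAxis: { categories: (1..review_lengths.size).map { |i| "Review #{i}" } },
    series: [{ name: 'Length', data: review_lengths }]
  }
end

# The view would serialize this to JSON and hand it to Highcharts.
puts JSON.generate(review_length_chart_options([120, 85, 230]))
```

A hash like this keeps the chart configuration in one place, so the ERB template only needs to embed the JSON.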

Controller Improvement

The controller design is based on the data the view needs. We changed the following methods; the specific changes are shown in Code Changes.

  1. calcutate_average_author_feedback_score
  2. list_url_in_comments

Code Changes

Here are the files we changed. 1. _review_report.html.erb We added a new column to show the average author-feedback score.

          <!--Author feedback-->
                <!--Dr.Kidd required to add a "author feedback" column that shows the average score the reviewers received from the authors. In this case, she can see all the data on a single screen.-->
                <!--Dr.Kidd's course-->
          <td align='left'>
           <% @response_maps.each_with_index do |ri, index| %>
           <% if Team.where(id: ri.reviewee_id).length > 0 %>
              <% if index == 0 %>
                  <div>
              <% else %>
                  <div style="border-top: solid; border-width: 1px;">
              <% end %>
                    <%= calcutate_average_author_feedback_score(@assignment.id, @assignment.max_team_size, ri.id, ri.reviewee_id) %>
                  </div>
            <% end %>
          <% end %>
          </td>

Then we use the newly added method to show all links below each team name.

                    <!-- E1979: only list all url(start with http or https) in comments-->
                    <% urls = list_url_in_comments(ri.id)%>
                  </div>
                  <% if !urls.nil? %>
                    <% urls.each do |url| %>
                      <%= link_to url.to_s, url %>
                     <% end %>
                  <% end %>


2. review_mapping_helper.rb We first fixed the calcutate_average_author_feedback_score method.

  # E1979: Completion/Progress view changes
  def calcutate_average_author_feedback_score(assignment_id, max_team_size, response_map_id, reviewee_id)
    review_response = ResponseMap.where(id: response_map_id).try(:first).try(:response).try(:last)
    author_feedback_avg_score = "-- / --"
    unless review_response.nil?
      # Total score of author feedback given by all team members. 
      author_feedback_total_score = 0
      # Max score of author feedback
      author_feedback_max_score =  0
      # Number of author feedback given by all team members. 
      author_feedback_count = 0
      # For each user in teamsUser find author feedback score.
      TeamsUser.where(team_id: reviewee_id).try(:each) do |teamsUser|
        user = teamsUser.try(:user)
        author = Participant.where(parent_id: assignment_id, user_id: user.id).try(:first) unless user.nil?
        feedback_response = ResponseMap.where(reviewed_object_id: review_response.id, reviewer_id: author.id).try(:first).try(:response).try(:last) unless author.nil?
        unless feedback_response.nil?
          author_feedback_total_score += feedback_response.total_score
          author_feedback_max_score = feedback_response.maximum_score
          author_feedback_count +=1
        end
      end
      # return "-- / --" if no author feedback, otherwise return avg_score
      author_feedback_avg_score = author_feedback_count == 0 ? "-- / --" : "#{author_feedback_total_score/author_feedback_count} / #{author_feedback_max_score}"
    end
    author_feedback_avg_score
  end
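The score-formatting part of the helper can be exercised on its own. The sketch below (hypothetical method name; in Expertiza the inputs come from ResponseMap/Answer records) mirrors the logic above: integer division of the total by the count, and "-- / --" when no author feedback exists.

```ruby
# Standalone sketch of the average-score formatting used by
# calcutate_average_author_feedback_score (illustration only).
def format_feedback_average(total_score, count, max_score)
  # No feedback yet: render the placeholder, as the helper does.
  return "-- / --" if count.zero?
  # Ruby integer division truncates, matching the helper's behavior.
  "#{total_score / count} / #{max_score}"
end

puts format_feedback_average(25, 3, 10) # "8 / 10" (integer division truncates)
puts format_feedback_average(0, 0, 10)  # "-- / --"
```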

Then we added a new method called list_url_in_comments to show all URLs.

  # E1979: Completion/Progress view changes
  # list all urls in comments for each review map
  def list_url_in_comments(response_map_id)
    response = Response.where(map_id: response_map_id)
    comments = ''
    # concatenate the comments of every answer in each response
    Answer.where(response_id: response.ids).try(:each) do |ans|
      comments += ans.comments.to_s
    end
    return nil if comments.empty?
    urls = []
    # check every whitespace-separated word in the comments
    comments.split(" ").each do |element|
      # keep the word if it starts with http or https
      urls.append(element) if element =~ /https?:\/\/[\S]+/
    end
    urls
  end
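The URL-matching logic in list_url_in_comments can be checked in isolation. Below is a minimal sketch with a made-up sample comment; only the regex is taken from the helper.

```ruby
# Extract every whitespace-separated word that starts with http:// or
# https://, using the same regex as list_url_in_comments.
def extract_urls(comments)
  comments.split(" ").select { |word| word =~ /https?:\/\/[\S]+/ }
end

comment = "Good work, see https://github.com/expertiza and http://example.org for details"
puts extract_urls(comment).inspect
# → ["https://github.com/expertiza", "http://example.org"]
```

Note that the regex only requires the word to contain `http://` or `https://` followed by non-whitespace, so trailing punctuation attached to a URL would be included in the match.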

3. review_mapping_helper_spec.rb Finally, we added tests for the changes we made. The specific test cases are shown in the Test Plan.

Test Plan

We will write unit and functional tests with RSpec, and run integration tests through Travis CI and the Coveralls bot.

Automated Testing Using Rspec

  1. Controller logic: since we changed two helper methods, we created corresponding tests for them in review_mapping_helper_spec.rb.
  2. View logic: we will add RSpec tests that exercise our new methods, such as have_file_added and have_link_added.

Coverage

The Coveralls coverage report will be filled in after we finish all our work.

Manual UI Testing

  1. Log in as an instructor
  2. Click Manage and choose Assignments
  3. Choose review report and click view
  4. Check "Average Feedback Score" column
  5. Check the URLs below each "Team reviewed"

The anticipated result should look like the screenshot below.

Team Information

  • Mentor: Mohit Jain ()
  • Jing Cai ()
  • Yongjian Zhu ()
  • Weiran Fu ()
  • Dongyuan Wang ()

Related Links

  1. Forked GitHub Repository