<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://wiki.expertiza.ncsu.edu/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Psingh22</id>
	<title>Expertiza_Wiki - User contributions [en]</title>
	<link rel="self" type="application/atom+xml" href="https://wiki.expertiza.ncsu.edu/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Psingh22"/>
	<link rel="alternate" type="text/html" href="https://wiki.expertiza.ncsu.edu/index.php?title=Special:Contributions/Psingh22"/>
	<updated>2026-05-12T06:22:37Z</updated>
	<subtitle>User contributions</subtitle>
	<generator>MediaWiki 1.41.0</generator>
	<entry>
		<id>https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2018/E1876_Completion/Progress_view&amp;diff=119787</id>
		<title>CSC/ECE 517 Fall 2018/E1876 Completion/Progress view</title>
		<link rel="alternate" type="text/html" href="https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2018/E1876_Completion/Progress_view&amp;diff=119787"/>
		<updated>2018-11-13T22:38:46Z</updated>

		<summary type="html">&lt;p&gt;Psingh22: /* Files that will be changed */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;='''Problem Statement'''=&lt;br /&gt;
A key component of Expertiza is peer reviews, which provide feedback to authors so that they can improve their work. Expertiza also supports grading of these reviews to ensure students write quality reviews, helping them learn more about the assignment by looking at their peers' work. In addition, Expertiza allows for metareviews: reviews that the authors of the original work write about the reviews of that work. This author feedback is useful for grading the reviews because it indicates how helpful each review was to the authors. The objective of this project is to add this metareview or author feedback information to the review report page, which shows a summary of all the reviews written by the students for an assignment.&lt;br /&gt;
&lt;br /&gt;
='''Goal'''=&lt;br /&gt;
&lt;br /&gt;
The aim of this project is to build this into the system by adding a column to the 'Review Report' page that shows the calculated author feedback score for each review. This will help instructors see how useful the reviews proved to the authors/team.&lt;br /&gt;
&lt;br /&gt;
='''Design'''=&lt;br /&gt;
&lt;br /&gt;
== User Interface Enhancements ==&lt;br /&gt;
&lt;br /&gt;
On the &amp;quot;Review report for Design exercise&amp;quot; page (log in as an instructor, then go to Manage -&amp;gt; Assignments -&amp;gt; View review report), we plan to add one more column showing the average author feedback rating for a student's review of a particular assignment. The logic for calculating the average metareview score will be similar to the already-implemented logic for the &amp;quot;Score Awarded/Average Score&amp;quot; column. Below is the page we plan to edit.&lt;br /&gt;
&lt;br /&gt;
[[File:Feedback_new.png]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Controller-level Logic ==&lt;br /&gt;
&lt;br /&gt;
The following method shows the logic we plan to write for calculating the average score of the feedback that authors give on reviews of their work.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre style=&amp;quot;color: black; border:1px;&amp;quot;&amp;gt;&lt;br /&gt;
 def calculate_avg_score_by_feedback(question_answers, q_max_score)&lt;br /&gt;
      # get score and summary of answers for each question&lt;br /&gt;
      # divide the sum of valid answers by the number of valid answers&lt;br /&gt;
&lt;br /&gt;
      valid_answer_counter = 0&lt;br /&gt;
      question_score = 0.0&lt;br /&gt;
      question_answers.each do |ans|&lt;br /&gt;
        # calculate score per question&lt;br /&gt;
        unless ans.answer.nil?&lt;br /&gt;
          question_score += ans.answer&lt;br /&gt;
          valid_answer_counter += 1&lt;br /&gt;
        end&lt;br /&gt;
      end&lt;br /&gt;
&lt;br /&gt;
      if valid_answer_counter &amp;gt; 0 &amp;amp;&amp;amp; q_max_score &amp;gt; 0&lt;br /&gt;
        # convert the score to a percentage; scale before rounding so two decimals survive&lt;br /&gt;
        question_score /= (valid_answer_counter * q_max_score)&lt;br /&gt;
        question_score = (question_score * 100).round(2)&lt;br /&gt;
      end&lt;br /&gt;
&lt;br /&gt;
      question_score&lt;br /&gt;
    end&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
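For example, with a 5-point rubric (q_max_score = 5) and two valid answers of 4 and 5, the method returns (4 + 5) / (2 * 5) * 100 = 90.0.&lt;br /&gt;
&lt;br /&gt;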
== Relevant Database Tables ==&lt;br /&gt;
&lt;br /&gt;
The following are the table structures we will need for this feature. The questions table holds every question in each questionnaire; we are only concerned with the questions in the feedback questionnaire. The answer to each question in the feedback questionnaire is saved in the Answers table below, keyed by question ID. To tell whether an answer is feedback from the authors or a review by a reviewer, each Answers row carries a response_id, a foreign key into the Response table. The Response table in turn gives us the map_id, which points to a row in the response map table. That row provides the reviewer_id, the reviewee_id, the reviewed_object_id (the ID of the assignment being reviewed), and the type (teammate review, author feedback, regular review, etc.). We have to fetch answers from the Answer table via response_id because in our case the response comes from the reviewee (the author), not from a reviewer. So we will fetch the answers whose response maps have type FeedbackResponseMap and calculate scores for those questions for the corresponding ReviewScores table. Below are excerpts from the [http://wiki.expertiza.ncsu.edu/index.php/Documentation_on_Database_Tables Expertiza database documentation] which describe the database tables relevant to our design.&lt;br /&gt;
&lt;br /&gt;
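Following the table descriptions below, this fetch could look roughly like the sketch here. It is a minimal sketch only, assuming standard ActiveRecord models named ResponseMap, Response, and Answer for these tables; assignment_id stands in for the ID of the assignment being reported on.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre style=&amp;quot;color: black; border:1px;&amp;quot;&amp;gt;&lt;br /&gt;
 # sketch: collect the author feedback answers for one assignment&lt;br /&gt;
 map_ids = ResponseMap.where(type: 'FeedbackResponseMap',&lt;br /&gt;
                             reviewed_object_id: assignment_id).pluck(:id)&lt;br /&gt;
 response_ids = Response.where(map_id: map_ids).pluck(:id)&lt;br /&gt;
 feedback_answers = Answer.where(response_id: response_ids)&lt;br /&gt;
 # feedback_answers can then be grouped by question_id and passed to&lt;br /&gt;
 # calculate_avg_score_by_feedback along with each question's maximum score&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;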
=== Questions Table Structure ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- A Questions page already exists, so we created a page named Questions table and gave an external link on the tables page --&amp;gt;&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
!Field Name !!Type !!Description &lt;br /&gt;
|-&lt;br /&gt;
!id &lt;br /&gt;
|int(11)  &lt;br /&gt;
|unique identifier for the record&lt;br /&gt;
|- &lt;br /&gt;
!txt   &lt;br /&gt;
|text  &lt;br /&gt;
|the question string&lt;br /&gt;
|- &lt;br /&gt;
!weight   &lt;br /&gt;
|int(11)  &lt;br /&gt;
|specifies the weighting of the question&lt;br /&gt;
|- &lt;br /&gt;
!questionnaire_id   &lt;br /&gt;
|int(11)&lt;br /&gt;
|the id of the questionnaire that this question belongs to&lt;br /&gt;
|-&lt;br /&gt;
!seq&lt;br /&gt;
|DECIMAL&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
!type&lt;br /&gt;
|VARCHAR(255)&lt;br /&gt;
|Type of question&lt;br /&gt;
|-&lt;br /&gt;
!size&lt;br /&gt;
|VARCHAR(255)&lt;br /&gt;
|Size of the question&lt;br /&gt;
|-&lt;br /&gt;
!alternatives&lt;br /&gt;
|VARCHAR(255)&lt;br /&gt;
|An alternative question with the same meaning&lt;br /&gt;
|-&lt;br /&gt;
!break_before&lt;br /&gt;
|BIT&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
!max_label&lt;br /&gt;
|VARCHAR(255)&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
!min_label&lt;br /&gt;
|VARCHAR(255)&lt;br /&gt;
|&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Answer Table Structure ===&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
!Field Name !!Type !!Description &lt;br /&gt;
|- &lt;br /&gt;
!id   &lt;br /&gt;
|int(11)  &lt;br /&gt;
|Unique ID for each Answers record.&lt;br /&gt;
|- &lt;br /&gt;
!question_id   &lt;br /&gt;
|int(11) &lt;br /&gt;
|ID of Question.&lt;br /&gt;
|- &lt;br /&gt;
!answer   &lt;br /&gt;
|int(11)  &lt;br /&gt;
|Numeric value of the answer.&lt;br /&gt;
|- &lt;br /&gt;
!comments  &lt;br /&gt;
|text  &lt;br /&gt;
|Comment accompanying the answer.&lt;br /&gt;
|- &lt;br /&gt;
!response_id   &lt;br /&gt;
|int(11) &lt;br /&gt;
|ID of the response associated with this Answer.&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Response Table Structure ===&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
!Field Name !!Type !!Description &lt;br /&gt;
|-&lt;br /&gt;
!id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The unique record id&lt;br /&gt;
|-&lt;br /&gt;
!map_id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The ID of the [[response_maps|response map]] defining the relationship that this response applies to&lt;br /&gt;
|-&lt;br /&gt;
!additional_comment&lt;br /&gt;
|text&lt;br /&gt;
|An additional comment provided by the reviewer to support his/her response&lt;br /&gt;
|-&lt;br /&gt;
!updated_at&lt;br /&gt;
|datetime&lt;br /&gt;
|The timestamp indicating when this response was last modified&lt;br /&gt;
|-&lt;br /&gt;
!created_at&lt;br /&gt;
|datetime&lt;br /&gt;
|The timestamp indicating when this response was created&lt;br /&gt;
|-&lt;br /&gt;
!version_num&lt;br /&gt;
|int(11)&lt;br /&gt;
|The version of the review.&lt;br /&gt;
|-&lt;br /&gt;
!round&lt;br /&gt;
|int(11)&lt;br /&gt;
|The round the review is connected to. &lt;br /&gt;
|-&lt;br /&gt;
!is_submitted&lt;br /&gt;
|tinyint(1)&lt;br /&gt;
|Boolean Field to indicate whether the review is submitted.&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Response Map Table ===&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
!Field Name !!Type !!Description &lt;br /&gt;
|-&lt;br /&gt;
!id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The unique record id&lt;br /&gt;
|- &lt;br /&gt;
!reviewed_object_id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The object being reviewed in the [[responses|response]]. Possible objects include other ResponseMaps or [[assignments]]&lt;br /&gt;
|-&lt;br /&gt;
!reviewer_id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The [[participants|participant]] (actually AssignmentParticipant) providing the response&lt;br /&gt;
|-&lt;br /&gt;
!reviewee_id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The [[teams|team]] (AssignmentTeam) receiving the response&lt;br /&gt;
|-&lt;br /&gt;
!type&lt;br /&gt;
|varchar(255)&lt;br /&gt;
|Used for subclassing the response map. Available subclasses are ReviewResponseMap, MetareviewResponseMap, FeedbackResponseMap, TeammateReviewResponseMap  &lt;br /&gt;
|-&lt;br /&gt;
!created_at&lt;br /&gt;
|DATETIME&lt;br /&gt;
|Date and Time for when the record was created&lt;br /&gt;
|-&lt;br /&gt;
!updated_at&lt;br /&gt;
|DATETIME&lt;br /&gt;
|Date and Time when the last update was made&lt;br /&gt;
|-&lt;br /&gt;
!calibrate_to&lt;br /&gt;
|BIT&lt;br /&gt;
|&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
== Files that will be changed ==&lt;br /&gt;
&lt;br /&gt;
1. To calculate average author feedback score: https://github.com/jainmohit1/expertiza/blob/master/app/models/on_the_fly_calc.rb&lt;br /&gt;
&lt;br /&gt;
2. To populate average author feedback score for the view: https://github.com/jainmohit1/expertiza/blob/master/app/controllers/review_mapping_controller.rb&lt;br /&gt;
&lt;br /&gt;
3. To add a field in the view: https://github.com/jainmohit1/expertiza/blob/master/app/views/review_mapping/response_report.html.haml&lt;br /&gt;
&lt;br /&gt;
4. To add a field in the partial : https://github.com/jainmohit1/expertiza/blob/master/app/views/review_mapping/_review_report.html.erb&lt;br /&gt;
&lt;br /&gt;
5. To add a field in the partial:  https://github.com/jainmohit1/expertiza/blob/master/app/views/review_mapping/_team_score.html.erb&lt;br /&gt;
&lt;br /&gt;
= Test Plan =&lt;br /&gt;
We plan to test the response report page (/review_mapping/response_report?id={:assignmentID}) to make sure the new field (average author feedback) exists.&lt;br /&gt;
&lt;br /&gt;
Using [http://rspec.info/ RSpec] we will add a test case to review_mapping_controller_spec.rb.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
    context 'when type is FeedbackResponseMap' do&lt;br /&gt;
      context 'when assignment has author feedback feature' do&lt;br /&gt;
        it 'renders response_report page with average author feedback data' do&lt;br /&gt;
          allow(assignment).to receive(:varying_rubrics_by_round?).and_return(true)&lt;br /&gt;
          allow(FeedbackResponseMap).to receive(:feedback_response_report).with('1', 'FeedbackResponseMap')&lt;br /&gt;
                                                                          .and_return([[participant, participant1], [1, 2], [3, 4], []])&lt;br /&gt;
          params = {&lt;br /&gt;
            id: 1,&lt;br /&gt;
            report: {type: 'FeedbackResponseMap'},&lt;br /&gt;
          }&lt;br /&gt;
          get :response_report, params&lt;br /&gt;
          expect(response).to render_template(:response_report)&lt;br /&gt;
          # check that the average author feedback data is assigned for the view&lt;br /&gt;
          expect(assigns(:avg_author_feedback)).not_to be_nil&lt;br /&gt;
        end&lt;br /&gt;
      end&lt;br /&gt;
    end&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
We also plan to manually test the response report page to make sure the new field aligns correctly in the expected place in the UI. We will attach a screenshot of the UI as the test result. We will test the cases of one and of multiple reviews by a reviewer, and verify that the number of metareviews and their average scores are rendered correctly.&lt;br /&gt;
&lt;br /&gt;
= References =&lt;br /&gt;
1. http://wiki.expertiza.ncsu.edu/index.php/Documentation_on_Database_Tables&lt;br /&gt;
&lt;br /&gt;
2. https://github.com/jainmohit1/expertiza&lt;/div&gt;</summary>
		<author><name>Psingh22</name></author>
	</entry>
	<entry>
		<id>https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2018/E1876_Completion/Progress_view&amp;diff=119785</id>
		<title>CSC/ECE 517 Fall 2018/E1876 Completion/Progress view</title>
		<link rel="alternate" type="text/html" href="https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2018/E1876_Completion/Progress_view&amp;diff=119785"/>
		<updated>2018-11-13T22:36:05Z</updated>

		<summary type="html">&lt;p&gt;Psingh22: /* Test Plan */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;='''Problem Statement'''=&lt;br /&gt;
A key component of Expertiza is peer reviews, which provide feedback to authors so that they can improve their work. Expertiza also supports grading of these reviews to ensure students write quality reviews, helping them learn more about the assignment by looking at their peers' work. In addition, Expertiza allows for metareviews: reviews that the authors of the original work write about the reviews of that work. This author feedback is useful for grading the reviews because it indicates how helpful each review was to the authors. The objective of this project is to add this metareview or author feedback information to the review report page, which shows a summary of all the reviews written by the students for an assignment.&lt;br /&gt;
&lt;br /&gt;
='''Goal'''=&lt;br /&gt;
&lt;br /&gt;
The aim of this project is to build this into the system by adding a column to the 'Review Report' page that shows the calculated author feedback score for each review. This will help instructors see how useful the reviews proved to the authors/team.&lt;br /&gt;
&lt;br /&gt;
='''Design'''=&lt;br /&gt;
&lt;br /&gt;
== User Interface Enhancements ==&lt;br /&gt;
&lt;br /&gt;
On the &amp;quot;Review report for Design exercise&amp;quot; page (log in as an instructor, then go to Manage -&amp;gt; Assignments -&amp;gt; View review report), we plan to add one more column showing the average author feedback rating for a student's review of a particular assignment. The logic for calculating the average metareview score will be similar to the already-implemented logic for the &amp;quot;Score Awarded/Average Score&amp;quot; column. Below is the page we plan to edit.&lt;br /&gt;
&lt;br /&gt;
[[File:Feedback_new.png]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Controller-level Logic ==&lt;br /&gt;
&lt;br /&gt;
The following method shows the logic we plan to write for calculating the average score of the feedback that authors give on reviews of their work.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre style=&amp;quot;color: black; border:1px;&amp;quot;&amp;gt;&lt;br /&gt;
 def calculate_avg_score_by_criterion(question_answers, q_max_score)&lt;br /&gt;
      # get score and summary of answers for each question&lt;br /&gt;
      # divide the sum of valid answers by the number of valid answers&lt;br /&gt;
&lt;br /&gt;
      valid_answer_counter = 0&lt;br /&gt;
      question_score = 0.0&lt;br /&gt;
      question_answers.each do |ans|&lt;br /&gt;
        # calculate score per question&lt;br /&gt;
        unless ans.answer.nil?&lt;br /&gt;
          question_score += ans.answer&lt;br /&gt;
          valid_answer_counter += 1&lt;br /&gt;
        end&lt;br /&gt;
      end&lt;br /&gt;
&lt;br /&gt;
      if valid_answer_counter &amp;gt; 0 &amp;amp;&amp;amp; q_max_score &amp;gt; 0&lt;br /&gt;
        # convert the score to a percentage; scale before rounding so two decimals survive&lt;br /&gt;
        question_score /= (valid_answer_counter * q_max_score)&lt;br /&gt;
        question_score = (question_score * 100).round(2)&lt;br /&gt;
      end&lt;br /&gt;
&lt;br /&gt;
      question_score&lt;br /&gt;
    end&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
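For example, with a 5-point rubric (q_max_score = 5) and two valid answers of 4 and 5, the method returns (4 + 5) / (2 * 5) * 100 = 90.0.&lt;br /&gt;
&lt;br /&gt;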
== Relevant Database Tables ==&lt;br /&gt;
&lt;br /&gt;
The following are the table structures we will need for this feature. The questions table holds every question in each questionnaire; we are only concerned with the questions in the feedback questionnaire. The answer to each question in the feedback questionnaire is saved in the Answers table below, keyed by question ID. To tell whether an answer is feedback from the authors or a review by a reviewer, each Answers row carries a response_id, a foreign key into the Response table. The Response table in turn gives us the map_id, which points to a row in the response map table. That row provides the reviewer_id, the reviewee_id, the reviewed_object_id (the ID of the assignment being reviewed), and the type (teammate review, author feedback, regular review, etc.). We have to fetch answers from the Answer table via response_id because in our case the response comes from the reviewee (the author), not from a reviewer. So we will fetch the answers whose response maps have type FeedbackResponseMap and calculate scores for those questions for the corresponding ReviewScores table. Below are excerpts from the [http://wiki.expertiza.ncsu.edu/index.php/Documentation_on_Database_Tables Expertiza database documentation] which describe the database tables relevant to our design.&lt;br /&gt;
&lt;br /&gt;
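Following the table descriptions below, this fetch could look roughly like the sketch here. It is a minimal sketch only, assuming standard ActiveRecord models named ResponseMap, Response, and Answer for these tables; assignment_id stands in for the ID of the assignment being reported on.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre style=&amp;quot;color: black; border:1px;&amp;quot;&amp;gt;&lt;br /&gt;
 # sketch: collect the author feedback answers for one assignment&lt;br /&gt;
 map_ids = ResponseMap.where(type: 'FeedbackResponseMap',&lt;br /&gt;
                             reviewed_object_id: assignment_id).pluck(:id)&lt;br /&gt;
 response_ids = Response.where(map_id: map_ids).pluck(:id)&lt;br /&gt;
 feedback_answers = Answer.where(response_id: response_ids)&lt;br /&gt;
 # feedback_answers can then be grouped by question_id and passed to&lt;br /&gt;
 # calculate_avg_score_by_criterion along with each question's maximum score&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;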
=== Questions Table Structure ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- A Questions page already exists, so we created a page named Questions table and gave an external link on the tables page --&amp;gt;&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
!Field Name !!Type !!Description &lt;br /&gt;
|-&lt;br /&gt;
!id &lt;br /&gt;
|int(11)  &lt;br /&gt;
|unique identifier for the record&lt;br /&gt;
|- &lt;br /&gt;
!txt   &lt;br /&gt;
|text  &lt;br /&gt;
|the question string&lt;br /&gt;
|- &lt;br /&gt;
!weight   &lt;br /&gt;
|int(11)  &lt;br /&gt;
|specifies the weighting of the question&lt;br /&gt;
|- &lt;br /&gt;
!questionnaire_id   &lt;br /&gt;
|int(11)&lt;br /&gt;
|the id of the questionnaire that this question belongs to&lt;br /&gt;
|-&lt;br /&gt;
!seq&lt;br /&gt;
|DECIMAL&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
!type&lt;br /&gt;
|VARCHAR(255)&lt;br /&gt;
|Type of question&lt;br /&gt;
|-&lt;br /&gt;
!size&lt;br /&gt;
|VARCHAR(255)&lt;br /&gt;
|Size of the question&lt;br /&gt;
|-&lt;br /&gt;
!alternatives&lt;br /&gt;
|VARCHAR(255)&lt;br /&gt;
|An alternative question with the same meaning&lt;br /&gt;
|-&lt;br /&gt;
!break_before&lt;br /&gt;
|BIT&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
!max_label&lt;br /&gt;
|VARCHAR(255)&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
!min_label&lt;br /&gt;
|VARCHAR(255)&lt;br /&gt;
|&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Answer Table Structure ===&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
!Field Name !!Type !!Description &lt;br /&gt;
|- &lt;br /&gt;
!id   &lt;br /&gt;
|int(11)  &lt;br /&gt;
|Unique ID for each Answers record.&lt;br /&gt;
|- &lt;br /&gt;
!question_id   &lt;br /&gt;
|int(11) &lt;br /&gt;
|ID of Question.&lt;br /&gt;
|- &lt;br /&gt;
!answer   &lt;br /&gt;
|int(11)  &lt;br /&gt;
|Numeric value of the answer.&lt;br /&gt;
|- &lt;br /&gt;
!comments  &lt;br /&gt;
|text  &lt;br /&gt;
|Comment accompanying the answer.&lt;br /&gt;
|- &lt;br /&gt;
!response_id   &lt;br /&gt;
|int(11) &lt;br /&gt;
|ID of the response associated with this Answer.&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Response Table Structure ===&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
!Field Name !!Type !!Description &lt;br /&gt;
|-&lt;br /&gt;
!id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The unique record id&lt;br /&gt;
|-&lt;br /&gt;
!map_id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The ID of the [[response_maps|response map]] defining the relationship that this response applies to&lt;br /&gt;
|-&lt;br /&gt;
!additional_comment&lt;br /&gt;
|text&lt;br /&gt;
|An additional comment provided by the reviewer to support his/her response&lt;br /&gt;
|-&lt;br /&gt;
!updated_at&lt;br /&gt;
|datetime&lt;br /&gt;
|The timestamp indicating when this response was last modified&lt;br /&gt;
|-&lt;br /&gt;
!created_at&lt;br /&gt;
|datetime&lt;br /&gt;
|The timestamp indicating when this response was created&lt;br /&gt;
|-&lt;br /&gt;
!version_num&lt;br /&gt;
|int(11)&lt;br /&gt;
|The version of the review.&lt;br /&gt;
|-&lt;br /&gt;
!round&lt;br /&gt;
|int(11)&lt;br /&gt;
|The round the review is connected to. &lt;br /&gt;
|-&lt;br /&gt;
!is_submitted&lt;br /&gt;
|tinyint(1)&lt;br /&gt;
|Boolean Field to indicate whether the review is submitted.&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Response Map Table ===&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
!Field Name !!Type !!Description &lt;br /&gt;
|-&lt;br /&gt;
!id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The unique record id&lt;br /&gt;
|- &lt;br /&gt;
!reviewed_object_id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The object being reviewed in the [[responses|response]]. Possible objects include other ResponseMaps or [[assignments]]&lt;br /&gt;
|-&lt;br /&gt;
!reviewer_id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The [[participants|participant]] (actually AssignmentParticipant) providing the response&lt;br /&gt;
|-&lt;br /&gt;
!reviewee_id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The [[teams|team]] (AssignmentTeam) receiving the response&lt;br /&gt;
|-&lt;br /&gt;
!type&lt;br /&gt;
|varchar(255)&lt;br /&gt;
|Used for subclassing the response map. Available subclasses are ReviewResponseMap, MetareviewResponseMap, FeedbackResponseMap, TeammateReviewResponseMap  &lt;br /&gt;
|-&lt;br /&gt;
!created_at&lt;br /&gt;
|DATETIME&lt;br /&gt;
|Date and Time for when the record was created&lt;br /&gt;
|-&lt;br /&gt;
!updated_at&lt;br /&gt;
|DATETIME&lt;br /&gt;
|Date and Time when the last update was made&lt;br /&gt;
|-&lt;br /&gt;
!calibrate_to&lt;br /&gt;
|BIT&lt;br /&gt;
|&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
== Files that will be changed ==&lt;br /&gt;
&lt;br /&gt;
1. https://github.com/jainmohit1/expertiza/blob/master/app/models/on_the_fly_calc.rb&lt;br /&gt;
&lt;br /&gt;
2. https://github.com/jainmohit1/expertiza/blob/master/app/controllers/review_mapping_controller.rb&lt;br /&gt;
&lt;br /&gt;
3. https://github.com/jainmohit1/expertiza/blob/master/app/views/review_mapping/response_report.html.haml&lt;br /&gt;
&lt;br /&gt;
4. https://github.com/jainmohit1/expertiza/blob/master/app/views/review_mapping/_review_report.html.erb&lt;br /&gt;
&lt;br /&gt;
5. https://github.com/jainmohit1/expertiza/blob/master/app/views/review_mapping/_team_score.html.erb&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Test Plan =&lt;br /&gt;
We plan to test the response report page (/review_mapping/response_report?id={:assignmentID}) to make sure the new field (average author feedback) exists.&lt;br /&gt;
&lt;br /&gt;
Using [http://rspec.info/ RSpec] we will add a test case to review_mapping_controller_spec.rb.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
    context 'when type is FeedbackResponseMap' do&lt;br /&gt;
      context 'when assignment has author feedback feature' do&lt;br /&gt;
        it 'renders response_report page with average author feedback data' do&lt;br /&gt;
          allow(assignment).to receive(:varying_rubrics_by_round?).and_return(true)&lt;br /&gt;
          allow(FeedbackResponseMap).to receive(:feedback_response_report).with('1', 'FeedbackResponseMap')&lt;br /&gt;
                                                                          .and_return([[participant, participant1], [1, 2], [3, 4], []])&lt;br /&gt;
          params = {&lt;br /&gt;
            id: 1,&lt;br /&gt;
            report: {type: 'FeedbackResponseMap'},&lt;br /&gt;
          }&lt;br /&gt;
          get :response_report, params&lt;br /&gt;
          expect(response).to render_template(:response_report)&lt;br /&gt;
          # check that the average author feedback data is assigned for the view&lt;br /&gt;
          expect(assigns(:avg_author_feedback)).not_to be_nil&lt;br /&gt;
        end&lt;br /&gt;
      end&lt;br /&gt;
    end&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
We also plan to manually test the response report page to make sure the new field aligns correctly in the expected place in the UI. We will attach a screenshot of the UI as the test result. We will test the cases of one and of multiple reviews by a reviewer, and verify that the number of metareviews and their average scores are rendered correctly.&lt;br /&gt;
&lt;br /&gt;
= References =&lt;br /&gt;
1. http://wiki.expertiza.ncsu.edu/index.php/Documentation_on_Database_Tables&lt;br /&gt;
&lt;br /&gt;
2. https://github.com/jainmohit1/expertiza&lt;/div&gt;</summary>
		<author><name>Psingh22</name></author>
	</entry>
	<entry>
		<id>https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2018/E1876_Completion/Progress_view&amp;diff=119784</id>
		<title>CSC/ECE 517 Fall 2018/E1876 Completion/Progress view</title>
		<link rel="alternate" type="text/html" href="https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2018/E1876_Completion/Progress_view&amp;diff=119784"/>
		<updated>2018-11-13T22:35:50Z</updated>

		<summary type="html">&lt;p&gt;Psingh22: /* Files that will be changed */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;='''Problem Statement'''=&lt;br /&gt;
A key component of Expertiza is peer reviews, which provide feedback to authors so that they can improve their work. Expertiza also supports grading of these reviews to ensure students write quality reviews, helping them learn more about the assignment by looking at their peers' work. In addition, Expertiza allows for metareviews: reviews that the authors of the original work write about the reviews of that work. This author feedback is useful for grading the reviews because it indicates how helpful each review was to the authors. The objective of this project is to add this metareview or author feedback information to the review report page, which shows a summary of all the reviews written by the students for an assignment.&lt;br /&gt;
&lt;br /&gt;
='''Goal'''=&lt;br /&gt;
&lt;br /&gt;
The aim of this project is to build this into the system by adding a column to the 'Review Report' page that shows the calculated author feedback score for each review. This will help instructors see how useful the reviews proved to the authors/team.&lt;br /&gt;
&lt;br /&gt;
='''Design'''=&lt;br /&gt;
&lt;br /&gt;
== User Interface Enhancements ==&lt;br /&gt;
&lt;br /&gt;
On the &amp;quot;Review report for Design exercise&amp;quot; page (log in as an instructor, then go to Manage -&amp;gt; Assignments -&amp;gt; View review report), we plan to add one more column showing the average author feedback rating for a student's review of a particular assignment. The logic for calculating the average metareview score will be similar to the already-implemented logic for the &amp;quot;Score Awarded/Average Score&amp;quot; column. Below is the page we plan to edit.&lt;br /&gt;
&lt;br /&gt;
[[File:Feedback_new.png]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Controller-level Logic ==&lt;br /&gt;
&lt;br /&gt;
The following method shows the logic we plan to write for calculating the average score of the feedback that authors give on reviews of their work.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre style=&amp;quot;color: black; border:1px;&amp;quot;&amp;gt;&lt;br /&gt;
 def calculate_avg_score_by_criterion(question_answers, q_max_score)&lt;br /&gt;
      # get score and summary of answers for each question&lt;br /&gt;
      # divide the sum of valid answers by the number of valid answers&lt;br /&gt;
&lt;br /&gt;
      valid_answer_counter = 0&lt;br /&gt;
      question_score = 0.0&lt;br /&gt;
      question_answers.each do |ans|&lt;br /&gt;
        # calculate score per question&lt;br /&gt;
        unless ans.answer.nil?&lt;br /&gt;
          question_score += ans.answer&lt;br /&gt;
          valid_answer_counter += 1&lt;br /&gt;
        end&lt;br /&gt;
      end&lt;br /&gt;
&lt;br /&gt;
      if valid_answer_counter &amp;gt; 0 &amp;amp;&amp;amp; q_max_score &amp;gt; 0&lt;br /&gt;
        # convert the score to a percentage; scale before rounding so two decimals survive&lt;br /&gt;
        question_score /= (valid_answer_counter * q_max_score)&lt;br /&gt;
        question_score = (question_score * 100).round(2)&lt;br /&gt;
      end&lt;br /&gt;
&lt;br /&gt;
      question_score&lt;br /&gt;
    end&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
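For example, with a 5-point rubric (q_max_score = 5) and two valid answers of 4 and 5, the method returns (4 + 5) / (2 * 5) * 100 = 90.0.&lt;br /&gt;
&lt;br /&gt;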
== Relevant Database Tables ==&lt;br /&gt;
&lt;br /&gt;
The following are the table structures we will need for this feature. The questions table holds every question in each questionnaire; we are only concerned with the questions in the feedback questionnaire. The answer to each question in the feedback questionnaire is saved in the Answers table below, keyed by question ID. To tell whether an answer is feedback from the authors or a review by a reviewer, each Answers row carries a response_id, a foreign key into the Response table. The Response table in turn gives us the map_id, which points to a row in the response map table. That row provides the reviewer_id, the reviewee_id, the reviewed_object_id (the ID of the assignment being reviewed), and the type (teammate review, author feedback, regular review, etc.). We have to fetch answers from the Answer table via response_id because in our case the response comes from the reviewee (the author), not from a reviewer. So we will fetch the answers whose response maps have type FeedbackResponseMap and calculate scores for those questions for the corresponding ReviewScores table. Below are excerpts from the [http://wiki.expertiza.ncsu.edu/index.php/Documentation_on_Database_Tables Expertiza database documentation] which describe the database tables relevant to our design.&lt;br /&gt;
&lt;br /&gt;
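Following the table descriptions below, this fetch could look roughly like the sketch here. It is a minimal sketch only, assuming standard ActiveRecord models named ResponseMap, Response, and Answer for these tables; assignment_id stands in for the ID of the assignment being reported on.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre style=&amp;quot;color: black; border:1px;&amp;quot;&amp;gt;&lt;br /&gt;
 # sketch: collect the author feedback answers for one assignment&lt;br /&gt;
 map_ids = ResponseMap.where(type: 'FeedbackResponseMap',&lt;br /&gt;
                             reviewed_object_id: assignment_id).pluck(:id)&lt;br /&gt;
 response_ids = Response.where(map_id: map_ids).pluck(:id)&lt;br /&gt;
 feedback_answers = Answer.where(response_id: response_ids)&lt;br /&gt;
 # feedback_answers can then be grouped by question_id and passed to&lt;br /&gt;
 # calculate_avg_score_by_criterion along with each question's maximum score&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;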
=== Questions Table Structure ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- A Questions page already exists, so we created a page named Questions table and gave an external link on the tables page --&amp;gt;&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
!Field Name !!Type !!Description &lt;br /&gt;
|-&lt;br /&gt;
!id &lt;br /&gt;
|int(11)  &lt;br /&gt;
|unique identifier for the record&lt;br /&gt;
|- &lt;br /&gt;
!txt   &lt;br /&gt;
|text  &lt;br /&gt;
|the question string&lt;br /&gt;
|- &lt;br /&gt;
!weight   &lt;br /&gt;
|int(11)  &lt;br /&gt;
|specifies the weighting of the question&lt;br /&gt;
|- &lt;br /&gt;
!questionnaire_id   &lt;br /&gt;
|int(11)&lt;br /&gt;
|the id of the questionnaire that this question belongs to&lt;br /&gt;
|-&lt;br /&gt;
!seq&lt;br /&gt;
|DECIMAL&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
!type&lt;br /&gt;
|VARCHAR(255)&lt;br /&gt;
|Type of question&lt;br /&gt;
|-&lt;br /&gt;
!size&lt;br /&gt;
|VARCHAR(255)&lt;br /&gt;
|Size of the question&lt;br /&gt;
|-&lt;br /&gt;
!alternatives&lt;br /&gt;
|VARCHAR(255)&lt;br /&gt;
|An alternative question with the same meaning&lt;br /&gt;
|-&lt;br /&gt;
!break_before&lt;br /&gt;
|BIT&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
!max_label&lt;br /&gt;
|VARCHAR(255)&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
!min_label&lt;br /&gt;
|VARCHAR(255)&lt;br /&gt;
|&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Answer Table Structure ===&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
!Field Name !!Type !!Description &lt;br /&gt;
|- &lt;br /&gt;
!id   &lt;br /&gt;
|int(11)  &lt;br /&gt;
|Unique ID for each Answers record.&lt;br /&gt;
|- &lt;br /&gt;
!question_id   &lt;br /&gt;
|int(11) &lt;br /&gt;
|ID of Question.&lt;br /&gt;
|- &lt;br /&gt;
!answer   &lt;br /&gt;
|int(11)  &lt;br /&gt;
|Numeric value of the answer.&lt;br /&gt;
|- &lt;br /&gt;
!comments  &lt;br /&gt;
|text  &lt;br /&gt;
|Comment accompanying the answer.&lt;br /&gt;
|- &lt;br /&gt;
!response_id   &lt;br /&gt;
|int(11) &lt;br /&gt;
|ID of the response associated with this Answer.&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Response Table Structure ===&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
!Field Name !!Type !!Description &lt;br /&gt;
|-&lt;br /&gt;
!id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The unique record id&lt;br /&gt;
|-&lt;br /&gt;
!map_id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The ID of the [[response_maps|response map]] defining the relationship that this response applies to&lt;br /&gt;
|-&lt;br /&gt;
!additional_comment&lt;br /&gt;
|text&lt;br /&gt;
|An additional comment provided by the reviewer to support his/her response&lt;br /&gt;
|-&lt;br /&gt;
!updated_at&lt;br /&gt;
|datetime&lt;br /&gt;
|The timestamp indicating when this response was last modified&lt;br /&gt;
|-&lt;br /&gt;
!created_at&lt;br /&gt;
|datetime&lt;br /&gt;
|The timestamp indicating when this response was created&lt;br /&gt;
|-&lt;br /&gt;
!version_num&lt;br /&gt;
|int(11)&lt;br /&gt;
|The version of the review.&lt;br /&gt;
|-&lt;br /&gt;
!round&lt;br /&gt;
|int(11)&lt;br /&gt;
|The round the review is connected to. &lt;br /&gt;
|-&lt;br /&gt;
!is_submitted&lt;br /&gt;
|tinyint(1)&lt;br /&gt;
|Boolean Field to indicate whether the review is submitted.&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Response Map Table ===&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
!Field Name !!Type !!Description &lt;br /&gt;
|-&lt;br /&gt;
!id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The unique record id&lt;br /&gt;
|- &lt;br /&gt;
!reviewed_object_id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The object being reviewed in the [[responses|response]]. Possible objects include other ResponseMaps or [[assignments]]&lt;br /&gt;
|-&lt;br /&gt;
!reviewer_id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The [[participants|participant]] (actually AssignmentParticipant) providing the response&lt;br /&gt;
|-&lt;br /&gt;
!reviewee_id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The [[teams|team]] (AssignmentTeam) receiving the response&lt;br /&gt;
|-&lt;br /&gt;
!type&lt;br /&gt;
|varchar(255)&lt;br /&gt;
|Used for subclassing the response map. Available subclasses are ReviewResponseMap, MetareviewResponseMap, FeedbackResponseMap, TeammateReviewResponseMap  &lt;br /&gt;
|-&lt;br /&gt;
!created_at&lt;br /&gt;
|DATETIME&lt;br /&gt;
|Date and Time for when the record was created&lt;br /&gt;
|-&lt;br /&gt;
!updated_at&lt;br /&gt;
|DATETIME&lt;br /&gt;
|Date and Time when the last update was made&lt;br /&gt;
|-&lt;br /&gt;
!calibrate_to&lt;br /&gt;
|BIT&lt;br /&gt;
|&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
= Test Plan =&lt;br /&gt;
We plan to test the response report page (/review_mapping/response_report?id={:assignmentID}) to make sure the new field (average author feedback) exists.&lt;br /&gt;
&lt;br /&gt;
Using [http://rspec.info/ RSpec] we will add a test case to review_mapping_controller_spec.rb.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
    context 'when type is FeedbackResponseMap' do&lt;br /&gt;
      context 'when assignment has author feedback feature' do&lt;br /&gt;
        it 'renders response_report page with average author feedback data' do&lt;br /&gt;
          allow(assignment).to receive(:varying_rubrics_by_round?).and_return(true)&lt;br /&gt;
          allow(FeedbackResponseMap).to receive(:feedback_response_report).with('1', 'FeedbackResponseMap')&lt;br /&gt;
                                                                          .and_return([[participant, participant1], [1, 2], [3, 4], []])&lt;br /&gt;
          params = {&lt;br /&gt;
            id: 1,&lt;br /&gt;
            report: {type: 'FeedbackResponseMap'},&lt;br /&gt;
          }&lt;br /&gt;
          get :response_report, params&lt;br /&gt;
          expect(response).to render_template(:response_report)&lt;br /&gt;
          # check that the average author feedback data is assigned for the view&lt;br /&gt;
          expect(assigns(:avg_author_feedback)).not_to be_nil&lt;br /&gt;
        end&lt;br /&gt;
      end&lt;br /&gt;
    end&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
We also plan to manually test the response report page to make sure the new field aligns correctly in the expected place in the UI. We will attach a screenshot of the UI as the test result. We will test the cases of one and of multiple reviews by a reviewer, and verify that the number of metareviews and their average scores are rendered correctly.&lt;br /&gt;
&lt;br /&gt;
= References =&lt;br /&gt;
1. http://wiki.expertiza.ncsu.edu/index.php/Documentation_on_Database_Tables&lt;br /&gt;
&lt;br /&gt;
2. https://github.com/jainmohit1/expertiza&lt;/div&gt;</summary>
		<author><name>Psingh22</name></author>
	</entry>
	<entry>
		<id>https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2018/E1876_Completion/Progress_view&amp;diff=119783</id>
		<title>CSC/ECE 517 Fall 2018/E1876 Completion/Progress view</title>
		<link rel="alternate" type="text/html" href="https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2018/E1876_Completion/Progress_view&amp;diff=119783"/>
		<updated>2018-11-13T22:35:35Z</updated>

		<summary type="html">&lt;p&gt;Psingh22: /* User Interface Enhancements */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;='''Problem Statement'''=&lt;br /&gt;
A key component of Expertiza is peer reviews, which provide feedback to authors so that they can improve their work. Expertiza also supports grading of these reviews to ensure students write quality reviews, helping them learn more about the assignment by looking at their peers' work. In addition, Expertiza allows for metareviews: reviews that the authors of the original work write about the reviews of that work. This author feedback is useful for grading the reviews because it indicates how helpful each review was to the authors. The objective of this project is to add this metareview or author feedback information to the review report page, which shows a summary of all the reviews written by the students for an assignment.&lt;br /&gt;
&lt;br /&gt;
='''Goal'''=&lt;br /&gt;
&lt;br /&gt;
The aim of this project is to build this into the system by adding a column to the 'Review Report' page that shows the calculated author feedback score for each review. This will help instructors see how useful the reviews proved to the authors/team.&lt;br /&gt;
&lt;br /&gt;
='''Design'''=&lt;br /&gt;
&lt;br /&gt;
== User Interface Enhancements ==&lt;br /&gt;
&lt;br /&gt;
On the &amp;quot;Review report for Design exercise&amp;quot; page (log in as an instructor, then go to Manage -&amp;gt; Assignments -&amp;gt; View review report), we plan to add one more column showing the average author feedback rating for a student's review of a particular assignment. The logic for calculating the average metareview score will be similar to the already-implemented logic for the &amp;quot;Score Awarded/Average Score&amp;quot; column. Below is the page we plan to edit.&lt;br /&gt;
&lt;br /&gt;
[[File:Feedback_new.png]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Files that will be changed ==&lt;br /&gt;
&lt;br /&gt;
1. https://github.com/jainmohit1/expertiza/blob/master/app/models/on_the_fly_calc.rb&lt;br /&gt;
&lt;br /&gt;
2. https://github.com/jainmohit1/expertiza/blob/master/app/controllers/review_mapping_controller.rb&lt;br /&gt;
&lt;br /&gt;
3. https://github.com/jainmohit1/expertiza/blob/master/app/views/review_mapping/response_report.html.haml&lt;br /&gt;
&lt;br /&gt;
4. https://github.com/jainmohit1/expertiza/blob/master/app/views/review_mapping/_review_report.html.erb&lt;br /&gt;
&lt;br /&gt;
5. https://github.com/jainmohit1/expertiza/blob/master/app/views/review_mapping/_team_score.html.erb&lt;br /&gt;
&lt;br /&gt;
== Controller-level Logic ==&lt;br /&gt;
&lt;br /&gt;
The following method shows the logic we plan to write for calculating the average score of the feedback that authors give on reviews of their work.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre style=&amp;quot;color: black; border:1px;&amp;quot;&amp;gt;&lt;br /&gt;
 def calculate_avg_score_by_criterion(question_answers, q_max_score)&lt;br /&gt;
      # get score and summary of answers for each question&lt;br /&gt;
      # divide the sum of valid answers by the number of valid answers&lt;br /&gt;
&lt;br /&gt;
      valid_answer_counter = 0&lt;br /&gt;
      question_score = 0.0&lt;br /&gt;
      question_answers.each do |ans|&lt;br /&gt;
        # calculate score per question&lt;br /&gt;
        unless ans.answer.nil?&lt;br /&gt;
          question_score += ans.answer&lt;br /&gt;
          valid_answer_counter += 1&lt;br /&gt;
        end&lt;br /&gt;
      end&lt;br /&gt;
&lt;br /&gt;
      if valid_answer_counter &amp;gt; 0 &amp;amp;&amp;amp; q_max_score &amp;gt; 0&lt;br /&gt;
        # convert the score to a percentage; scale before rounding so two decimals survive&lt;br /&gt;
        question_score /= (valid_answer_counter * q_max_score)&lt;br /&gt;
        question_score = (question_score * 100).round(2)&lt;br /&gt;
      end&lt;br /&gt;
&lt;br /&gt;
      question_score&lt;br /&gt;
    end&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
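For example, with a 5-point rubric (q_max_score = 5) and two valid answers of 4 and 5, the method returns (4 + 5) / (2 * 5) * 100 = 90.0.&lt;br /&gt;
&lt;br /&gt;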
== Relevant Database Tables ==&lt;br /&gt;
&lt;br /&gt;
The following are the table structures we will need for this feature. The questions table holds every question in each questionnaire; we are only concerned with the questions in the feedback questionnaire. The answer to each question in the feedback questionnaire is saved in the Answers table below, keyed by question ID. To tell whether an answer is feedback from the authors or a review by a reviewer, each Answers row carries a response_id, a foreign key into the Response table. The Response table in turn gives us the map_id, which points to a row in the response map table. That row provides the reviewer_id, the reviewee_id, the reviewed_object_id (the ID of the assignment being reviewed), and the type (teammate review, author feedback, regular review, etc.). We have to fetch answers from the Answer table via response_id because in our case the response comes from the reviewee (the author), not from a reviewer. So we will fetch the answers whose response maps have type FeedbackResponseMap and calculate scores for those questions for the corresponding ReviewScores table. Below are excerpts from the [http://wiki.expertiza.ncsu.edu/index.php/Documentation_on_Database_Tables Expertiza database documentation] which describe the database tables relevant to our design.&lt;br /&gt;
&lt;br /&gt;
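Following the table descriptions below, this fetch could look roughly like the sketch here. It is a minimal sketch only, assuming standard ActiveRecord models named ResponseMap, Response, and Answer for these tables; assignment_id stands in for the ID of the assignment being reported on.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre style=&amp;quot;color: black; border:1px;&amp;quot;&amp;gt;&lt;br /&gt;
 # sketch: collect the author feedback answers for one assignment&lt;br /&gt;
 map_ids = ResponseMap.where(type: 'FeedbackResponseMap',&lt;br /&gt;
                             reviewed_object_id: assignment_id).pluck(:id)&lt;br /&gt;
 response_ids = Response.where(map_id: map_ids).pluck(:id)&lt;br /&gt;
 feedback_answers = Answer.where(response_id: response_ids)&lt;br /&gt;
 # feedback_answers can then be grouped by question_id and passed to&lt;br /&gt;
 # calculate_avg_score_by_criterion along with each question's maximum score&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;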
=== Questions Table Structure ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- A Questions page already exists, so we created a page named Questions table and gave an external link on the tables page --&amp;gt;&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
!Field Name !!Type !!Description &lt;br /&gt;
|-&lt;br /&gt;
!id &lt;br /&gt;
|int(11)  &lt;br /&gt;
|unique identifier for the record&lt;br /&gt;
|- &lt;br /&gt;
!txt   &lt;br /&gt;
|text  &lt;br /&gt;
|the question string&lt;br /&gt;
|- &lt;br /&gt;
!weight   &lt;br /&gt;
|int(11)  &lt;br /&gt;
|specifies the weighting of the question&lt;br /&gt;
|- &lt;br /&gt;
!questionnaire_id   &lt;br /&gt;
|int(11)&lt;br /&gt;
|the id of the questionnaire that this question belongs to&lt;br /&gt;
|-&lt;br /&gt;
!seq&lt;br /&gt;
|DECIMAL&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
!type&lt;br /&gt;
|VARCHAR(255)&lt;br /&gt;
|Type of question&lt;br /&gt;
|-&lt;br /&gt;
!size&lt;br /&gt;
|VARCHAR(255)&lt;br /&gt;
|Size of the question&lt;br /&gt;
|-&lt;br /&gt;
!alternatives&lt;br /&gt;
|VARCHAR(255)&lt;br /&gt;
|An alternative question with the same meaning&lt;br /&gt;
|-&lt;br /&gt;
!break_before&lt;br /&gt;
|BIT&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
!max_label&lt;br /&gt;
|VARCHAR(255)&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
!min_label&lt;br /&gt;
|VARCHAR(255)&lt;br /&gt;
|&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Answer Table Structure ===&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
!Field Name !!Type !!Description &lt;br /&gt;
|- &lt;br /&gt;
!id   &lt;br /&gt;
|int(11)  &lt;br /&gt;
|Unique ID for each Answers record.&lt;br /&gt;
|- &lt;br /&gt;
!question_id   &lt;br /&gt;
|int(11) &lt;br /&gt;
|ID of Question.&lt;br /&gt;
|- &lt;br /&gt;
!answer   &lt;br /&gt;
|int(11)  &lt;br /&gt;
|Numeric value of the answer.&lt;br /&gt;
|- &lt;br /&gt;
!comments  &lt;br /&gt;
|text  &lt;br /&gt;
|Comment accompanying the answer.&lt;br /&gt;
|- &lt;br /&gt;
!response_id   &lt;br /&gt;
|int(11) &lt;br /&gt;
|ID of the response associated with this Answer.&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Response Table Structure ===&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
!Field Name !!Type !!Description &lt;br /&gt;
|-&lt;br /&gt;
!id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The unique record id&lt;br /&gt;
|-&lt;br /&gt;
!map_id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The ID of the [[response_maps|response map]] defining the relationship that this response applies to&lt;br /&gt;
|-&lt;br /&gt;
!additional_comment&lt;br /&gt;
|text&lt;br /&gt;
|An additional comment provided by the reviewer to support his/her response&lt;br /&gt;
|-&lt;br /&gt;
!updated_at&lt;br /&gt;
|datetime&lt;br /&gt;
|The timestamp indicating when this response was last modified&lt;br /&gt;
|-&lt;br /&gt;
!created_at&lt;br /&gt;
|datetime&lt;br /&gt;
|The timestamp indicating when this response was created&lt;br /&gt;
|-&lt;br /&gt;
!version_num&lt;br /&gt;
|int(11)&lt;br /&gt;
|The version of the review.&lt;br /&gt;
|-&lt;br /&gt;
!round&lt;br /&gt;
|int(11)&lt;br /&gt;
|The round the review is connected to. &lt;br /&gt;
|-&lt;br /&gt;
!is_submitted&lt;br /&gt;
|tinyint(1)&lt;br /&gt;
|Boolean Field to indicate whether the review is submitted.&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Response Map Table ===&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
!Field Name !!Type !!Description &lt;br /&gt;
|-&lt;br /&gt;
!id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The unique record id&lt;br /&gt;
|- &lt;br /&gt;
!reviewed_object_id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The object being reviewed in the [[responses|response]]. Possible objects include other ResponseMaps or [[assignments]]&lt;br /&gt;
|-&lt;br /&gt;
!reviewer_id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The [[participants|participant]] (actually AssignmentParticipant) providing the response&lt;br /&gt;
|-&lt;br /&gt;
!reviewee_id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The [[teams|team]] (AssignmentTeam) receiving the response&lt;br /&gt;
|-&lt;br /&gt;
!type&lt;br /&gt;
|varchar(255)&lt;br /&gt;
|Used for subclassing the response map. Available subclasses are ReviewResponseMap, MetareviewResponseMap, FeedbackResponseMap, TeammateReviewResponseMap  &lt;br /&gt;
|-&lt;br /&gt;
!created_at&lt;br /&gt;
|DATETIME&lt;br /&gt;
|Date and Time for when the record was created&lt;br /&gt;
|-&lt;br /&gt;
!updated_at&lt;br /&gt;
|DATETIME&lt;br /&gt;
|Date and Time when the last update was made&lt;br /&gt;
|-&lt;br /&gt;
!calibrate_to&lt;br /&gt;
|BIT&lt;br /&gt;
|&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
= Test Plan =&lt;br /&gt;
We plan to test the response report page (/review_mapping/response_report?id={:assignmentID}) to make sure the new field (average author feedback) exists.&lt;br /&gt;
&lt;br /&gt;
Using [http://rspec.info/ RSpec] we will add a test case to review_mapping_controller_spec.rb.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
    context 'when type is FeedbackResponseMap' do&lt;br /&gt;
      context 'when assignment has author feedback feature' do&lt;br /&gt;
        it 'renders response_report page with average author feedback data' do&lt;br /&gt;
          allow(assignment).to receive(:varying_rubrics_by_round?).and_return(true)&lt;br /&gt;
          allow(FeedbackResponseMap).to receive(:feedback_response_report).with('1', 'FeedbackResponseMap')&lt;br /&gt;
                                                                          .and_return([[participant, participant1], [1, 2], [3, 4], []])&lt;br /&gt;
          params = {&lt;br /&gt;
            id: 1,&lt;br /&gt;
            report: {type: 'FeedbackResponseMap'},&lt;br /&gt;
          }&lt;br /&gt;
          get :response_report, params&lt;br /&gt;
          expect(response).to render_template(:response_report)&lt;br /&gt;
          # check that the average author feedback data is assigned for the view&lt;br /&gt;
          expect(assigns(:avg_author_feedback)).not_to be_nil&lt;br /&gt;
        end&lt;br /&gt;
      end&lt;br /&gt;
    end&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
We also plan to manually test the response report page to make sure the new field aligns correctly in the expected place in the UI. We will attach a screenshot of the UI as the test result. We will test the cases of one and of multiple reviews by a reviewer, and verify that the number of metareviews and their average scores are rendered correctly.&lt;br /&gt;
&lt;br /&gt;
= References =&lt;br /&gt;
1. http://wiki.expertiza.ncsu.edu/index.php/Documentation_on_Database_Tables&lt;br /&gt;
&lt;br /&gt;
2. https://github.com/jainmohit1/expertiza&lt;/div&gt;</summary>
		<author><name>Psingh22</name></author>
	</entry>
	<entry>
		<id>https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2018/E1876_Completion/Progress_view&amp;diff=119775</id>
		<title>CSC/ECE 517 Fall 2018/E1876 Completion/Progress view</title>
		<link rel="alternate" type="text/html" href="https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2018/E1876_Completion/Progress_view&amp;diff=119775"/>
		<updated>2018-11-13T22:23:56Z</updated>

		<summary type="html">&lt;p&gt;Psingh22: /* Test Plan */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;='''Problem Statement'''=&lt;br /&gt;
A key component of Expertiza is peer reviews, which provide feedback to authors so that they can improve their work. Expertiza also supports grading of these reviews to ensure students write quality reviews, helping them learn more about the assignment by looking at their peers' work. In addition, Expertiza allows for metareviews: reviews that the authors of the original work write about the reviews of that work. This author feedback is useful for grading the reviews because it indicates how helpful each review was to the authors. The objective of this project is to add this metareview or author feedback information to the review report page, which shows a summary of all the reviews written by the students for an assignment.&lt;br /&gt;
&lt;br /&gt;
='''Goal'''=&lt;br /&gt;
&lt;br /&gt;
The aim of this project is to build this into the system by adding a column to the 'Review Report' page that shows the calculated author feedback score for each review. This will help instructors see how useful the reviews proved to the authors/team.&lt;br /&gt;
&lt;br /&gt;
='''Design'''=&lt;br /&gt;
&lt;br /&gt;
== User Interface Enhancements ==&lt;br /&gt;
&lt;br /&gt;
On the &amp;quot;Review report for Design exercise&amp;quot; page (log in as an instructor, then go to Manage -&amp;gt; Assignments -&amp;gt; View review report), we plan to add one more column showing the average author feedback rating for a student's review of a particular assignment. The logic for calculating the average metareview score will be similar to the already-implemented logic for the &amp;quot;Score Awarded/Average Score&amp;quot; column. Below is the page we plan to edit.&lt;br /&gt;
&lt;br /&gt;
[[File:Feedback_new.png]]&lt;br /&gt;
&lt;br /&gt;
== Controller-level Logic ==&lt;br /&gt;
&lt;br /&gt;
The following method shows the logic we plan to write for calculating the average score of the feedback that authors give on reviews of their work.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre style=&amp;quot;color: black; border:1px;&amp;quot;&amp;gt;&lt;br /&gt;
 def calculate_avg_score_by_criterion(question_answers, q_max_score)&lt;br /&gt;
      # get score and summary of answers for each question&lt;br /&gt;
      # divide the sum of valid answers by the number of valid answers&lt;br /&gt;
&lt;br /&gt;
      valid_answer_counter = 0&lt;br /&gt;
      question_score = 0.0&lt;br /&gt;
      question_answers.each do |ans|&lt;br /&gt;
        # calculate score per question&lt;br /&gt;
        unless ans.answer.nil?&lt;br /&gt;
          question_score += ans.answer&lt;br /&gt;
          valid_answer_counter += 1&lt;br /&gt;
        end&lt;br /&gt;
      end&lt;br /&gt;
&lt;br /&gt;
      if valid_answer_counter &amp;gt; 0 &amp;amp;&amp;amp; q_max_score &amp;gt; 0&lt;br /&gt;
        # convert the score to a percentage; scale before rounding so two decimals survive&lt;br /&gt;
        question_score /= (valid_answer_counter * q_max_score)&lt;br /&gt;
        question_score = (question_score * 100).round(2)&lt;br /&gt;
      end&lt;br /&gt;
&lt;br /&gt;
      question_score&lt;br /&gt;
    end&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
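For example, with a 5-point rubric (q_max_score = 5) and two valid answers of 4 and 5, the method returns (4 + 5) / (2 * 5) * 100 = 90.0.&lt;br /&gt;
&lt;br /&gt;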
== Relevant Database Tables ==&lt;br /&gt;
&lt;br /&gt;
The following are the table structures we will need for this feature. The questions table holds every question in each questionnaire; we are only concerned with the questions in the feedback questionnaire. The answer to each question in the feedback questionnaire is saved in the Answers table below, keyed by question ID. To tell whether an answer is feedback from the authors or a review by a reviewer, each Answers row carries a response_id, a foreign key into the Response table. The Response table in turn gives us the map_id, which points to a row in the response map table. That row provides the reviewer_id, the reviewee_id, the reviewed_object_id (the ID of the assignment being reviewed), and the type (teammate review, author feedback, regular review, etc.). We have to fetch answers from the Answer table via response_id because in our case the response comes from the reviewee (the author), not from a reviewer. So we will fetch the answers whose response maps have type FeedbackResponseMap and calculate scores for those questions for the corresponding ReviewScores table. Below are excerpts from the [http://wiki.expertiza.ncsu.edu/index.php/Documentation_on_Database_Tables Expertiza database documentation] which describe the database tables relevant to our design.&lt;br /&gt;
&lt;br /&gt;
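Following the table descriptions below, this fetch could look roughly like the sketch here. It is a minimal sketch only, assuming standard ActiveRecord models named ResponseMap, Response, and Answer for these tables; assignment_id stands in for the ID of the assignment being reported on.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre style=&amp;quot;color: black; border:1px;&amp;quot;&amp;gt;&lt;br /&gt;
 # sketch: collect the author feedback answers for one assignment&lt;br /&gt;
 map_ids = ResponseMap.where(type: 'FeedbackResponseMap',&lt;br /&gt;
                             reviewed_object_id: assignment_id).pluck(:id)&lt;br /&gt;
 response_ids = Response.where(map_id: map_ids).pluck(:id)&lt;br /&gt;
 feedback_answers = Answer.where(response_id: response_ids)&lt;br /&gt;
 # feedback_answers can then be grouped by question_id and passed to&lt;br /&gt;
 # calculate_avg_score_by_criterion along with each question's maximum score&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;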
=== Questions Table Structure ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- A Questions page already exists, so we created a page named Questions table and gave an external link on the tables page --&amp;gt;&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
!Field Name !!Type !!Description &lt;br /&gt;
|-&lt;br /&gt;
!id &lt;br /&gt;
|int(11)  &lt;br /&gt;
|unique identifier for the record&lt;br /&gt;
|- &lt;br /&gt;
!txt   &lt;br /&gt;
|text  &lt;br /&gt;
|the question string&lt;br /&gt;
|- &lt;br /&gt;
!weight   &lt;br /&gt;
|int(11)  &lt;br /&gt;
|specifies the weighting of the question&lt;br /&gt;
|- &lt;br /&gt;
!questionnaire_id   &lt;br /&gt;
|int(11)&lt;br /&gt;
|the id of the questionnaire that this question belongs to&lt;br /&gt;
|-&lt;br /&gt;
!seq&lt;br /&gt;
|DECIMAL&lt;br /&gt;
|sequence number that determines the order in which the question is displayed&lt;br /&gt;
|-&lt;br /&gt;
!type&lt;br /&gt;
|VARCHAR(255)&lt;br /&gt;
|type of the question, used for subclassing&lt;br /&gt;
|-&lt;br /&gt;
!size&lt;br /&gt;
|VARCHAR(255)&lt;br /&gt;
|display size of the input field for the question&lt;br /&gt;
|-&lt;br /&gt;
!alternatives&lt;br /&gt;
|VARCHAR(255)&lt;br /&gt;
|alternative choices offered by dropdown-type questions&lt;br /&gt;
|-&lt;br /&gt;
!break_before&lt;br /&gt;
|BIT&lt;br /&gt;
|whether a break is inserted before the question when the rubric is displayed&lt;br /&gt;
|-&lt;br /&gt;
!max_label&lt;br /&gt;
|VARCHAR(255)&lt;br /&gt;
|label shown at the maximum end of the rating scale&lt;br /&gt;
|-&lt;br /&gt;
!min_label&lt;br /&gt;
|VARCHAR(255)&lt;br /&gt;
|label shown at the minimum end of the rating scale&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Answer Table Structure ===&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
!Field Name !!Type !!Description &lt;br /&gt;
|- &lt;br /&gt;
!id   &lt;br /&gt;
|int(11)  &lt;br /&gt;
|Unique ID for each Answers record.&lt;br /&gt;
|- &lt;br /&gt;
!question_id   &lt;br /&gt;
|int(11) &lt;br /&gt;
|ID of Question.&lt;br /&gt;
|- &lt;br /&gt;
!answer&lt;br /&gt;
|int(11)&lt;br /&gt;
|Numeric score value of the answer.&lt;br /&gt;
|- &lt;br /&gt;
!comments&lt;br /&gt;
|text&lt;br /&gt;
|Text comment accompanying the answer.&lt;br /&gt;
|- &lt;br /&gt;
!response_id&lt;br /&gt;
|int(11)&lt;br /&gt;
|ID of the response associated with this Answer.&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Response Table Structure ===&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
!Field Name !!Type !!Description &lt;br /&gt;
|-&lt;br /&gt;
!id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The unique record id&lt;br /&gt;
|-&lt;br /&gt;
!map_id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The ID of the [[response_maps|response map]] defining the relationship that this response applies to&lt;br /&gt;
|-&lt;br /&gt;
!additional_comment&lt;br /&gt;
|text&lt;br /&gt;
|An additional comment provided by the reviewer to support his/her response&lt;br /&gt;
|-&lt;br /&gt;
!updated_at&lt;br /&gt;
|datetime&lt;br /&gt;
|The timestamp indicating when this response was last modified&lt;br /&gt;
|-&lt;br /&gt;
!created_at&lt;br /&gt;
|datetime&lt;br /&gt;
|The timestamp indicating when this response was created&lt;br /&gt;
|-&lt;br /&gt;
!version_num&lt;br /&gt;
|int(11)&lt;br /&gt;
|The version of the review.&lt;br /&gt;
|-&lt;br /&gt;
!round&lt;br /&gt;
|int(11)&lt;br /&gt;
|The round the review is connected to. &lt;br /&gt;
|-&lt;br /&gt;
!is_submitted&lt;br /&gt;
|tinyint(1)&lt;br /&gt;
|Boolean Field to indicate whether the review is submitted.&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Response Map Table ===&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
!Field Name !!Type !!Description &lt;br /&gt;
|-&lt;br /&gt;
!id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The unique record id&lt;br /&gt;
|- &lt;br /&gt;
!reviewed_object_id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The object being reviewed in the [[responses|response]]. Possible objects include other ResponseMaps or [[assignments]]&lt;br /&gt;
|-&lt;br /&gt;
!reviewer_id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The [[participants|participant]] (actually AssignmentParticipant) providing the response&lt;br /&gt;
|-&lt;br /&gt;
!reviewee_id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The [[teams|team]] (AssignmentTeam) receiving the response&lt;br /&gt;
|-&lt;br /&gt;
!type&lt;br /&gt;
|varchar(255)&lt;br /&gt;
|Used for subclassing the response map. Available subclasses are ReviewResponseMap, MetareviewResponseMap, FeedbackResponseMap, TeammateReviewResponseMap  &lt;br /&gt;
|-&lt;br /&gt;
!created_at&lt;br /&gt;
|DATETIME&lt;br /&gt;
|Date and Time for when the record was created&lt;br /&gt;
|-&lt;br /&gt;
!updated_at&lt;br /&gt;
|DATETIME&lt;br /&gt;
|Date and Time when the last update was made&lt;br /&gt;
|-&lt;br /&gt;
!calibrate_to&lt;br /&gt;
|BIT&lt;br /&gt;
|Flag indicating whether this mapping is used for review calibration&lt;br /&gt;
|}&lt;br /&gt;
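&lt;br /&gt;
Taken together, these tables are navigated through ordinary Rails associations. The sketch below records the relationships as we read them from the documentation above; the association and class names follow Expertiza's conventions but are assumptions rather than verified code.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre style=&amp;quot;color: black; border:1px;&amp;quot;&amp;gt;&lt;br /&gt;
 # Assumed ActiveRecord models mirroring the tables above.&lt;br /&gt;
 class Question &amp;lt; ActiveRecord::Base&lt;br /&gt;
   belongs_to :questionnaire                         # questions.questionnaire_id&lt;br /&gt;
   has_many :answers                                 # answers.question_id&lt;br /&gt;
 end&lt;br /&gt;
&lt;br /&gt;
 class Answer &amp;lt; ActiveRecord::Base&lt;br /&gt;
   belongs_to :question                              # answers.question_id&lt;br /&gt;
   belongs_to :response                              # answers.response_id&lt;br /&gt;
 end&lt;br /&gt;
&lt;br /&gt;
 class Response &amp;lt; ActiveRecord::Base&lt;br /&gt;
   belongs_to :response_map, foreign_key: 'map_id'   # responses.map_id&lt;br /&gt;
   has_many :answers&lt;br /&gt;
 end&lt;br /&gt;
&lt;br /&gt;
 class ResponseMap &amp;lt; ActiveRecord::Base&lt;br /&gt;
   # Single-table inheritance on the 'type' column distinguishes&lt;br /&gt;
   # ReviewResponseMap, FeedbackResponseMap, MetareviewResponseMap, etc.&lt;br /&gt;
   has_many :responses, foreign_key: 'map_id'&lt;br /&gt;
 end&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;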
&lt;br /&gt;
= Test Plan =&lt;br /&gt;
* We plan to test the response report page (/review_mapping/response_report?id={:assignmentID}) to make sure the new field (average author feedback) exists.&lt;br /&gt;
** Using RSpec, we will add a test case to review_mapping_controller_spec.rb:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
    context 'when type is FeedbackResponseMap' do&lt;br /&gt;
      context 'when assignment has author feedback feature' do&lt;br /&gt;
        it 'renders response_report page with average author feedback data' do&lt;br /&gt;
          allow(assignment).to receive(:varying_rubrics_by_round?).and_return(true)&lt;br /&gt;
          allow(FeedbackResponseMap).to receive(:feedback_response_report).with('1', 'FeedbackResponseMap')&lt;br /&gt;
                                                                          .and_return([participant, participant1], [1, 2], [3, 4], [])&lt;br /&gt;
          params = {&lt;br /&gt;
            id: 1,&lt;br /&gt;
            report: {type: 'FeedbackResponseMap'}&lt;br /&gt;
          }&lt;br /&gt;
          get :response_report, params&lt;br /&gt;
          expect(response).to render_template(:response_report)&lt;br /&gt;
          # The new average author feedback data should be assigned for the view.&lt;br /&gt;
          expect(assigns(:avg_author_feedback)).not_to be_nil&lt;br /&gt;
        end&lt;br /&gt;
      end&lt;br /&gt;
    end&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
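&lt;br /&gt;
Once added, the spec can be run on its own with, for example, bundle exec rspec spec/controllers/review_mapping_controller_spec.rb.&lt;br /&gt;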
&lt;br /&gt;
* Also, we plan to manually test the response report page to make sure the new column aligns correctly in the UI in the expected place. We will attach a screenshot of the UI as the test result.&lt;br /&gt;
&lt;br /&gt;
= References =&lt;br /&gt;
1. http://wiki.expertiza.ncsu.edu/index.php/Documentation_on_Database_Tables&lt;br /&gt;
&lt;br /&gt;
2. https://github.com/jainmohit1/expertiza&lt;/div&gt;</summary>
		<author><name>Psingh22</name></author>
	</entry>
	<entry>
		<id>https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2018/E1876_Completion/Progress_view&amp;diff=119766</id>
		<title>CSC/ECE 517 Fall 2018/E1876 Completion/Progress view</title>
		<link rel="alternate" type="text/html" href="https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2018/E1876_Completion/Progress_view&amp;diff=119766"/>
		<updated>2018-11-13T21:30:58Z</updated>

		<summary type="html">&lt;p&gt;Psingh22: /* References */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;='''Problem Statement'''=&lt;br /&gt;
In Expertiza, peer reviews are used as a metric to evaluate someone’s project. Once someone has peer reviewed a project, the authors of the project can also provide a feedback for this review in terms of ‘Author feedback’. While grading peer reviews, it would be nice for the instructors to take into account the author feedbacks given on a particular peer review, this will be helpful in evaluating how helpful the peer review actually was to the author of the project.&lt;br /&gt;
 &lt;br /&gt;
='''Goal'''=&lt;br /&gt;
&lt;br /&gt;
The aim of this project is to build this into the system. We need an additional column in the 'Review Report' page for reviews which shows the calculation of the author feedback. This will help instructor's to know how the reviews proved useful to the authors/team. The aim of this project is to integrate the author feedback column in the summary page&lt;br /&gt;
&lt;br /&gt;
='''Design'''=&lt;br /&gt;
&lt;br /&gt;
== Database ==&lt;br /&gt;
&lt;br /&gt;
The following are the table structures we will need for mapping. First, the questions table has all the questions based on the questionnaire. We will be only concerned with the questions in the feedback questionnaire. The answers for each question in the feedback questionnaire is saved in Answers table below based on Question ID. Now, in order to know if the answers is a feedback by team members or a review by reviewer, the mapping for Answers table is done by response_id which is a foreign key to response table. Response table gives us map_id which maps to Response Maps table. Now, Response Map table gives us information of the reviewer_id, reviewee_id, reviewed_object_id (which is the id for the assignment being reviewed) and the type (whether it's a teammate review, author feedback or a regular review). We will have to fetch the answers from the Answer table based on response_id because in our case, the response is from a reviewee and not a reviewer. So, we will fetch those answers whose response type is FeedbackResponseMap and calculate scores for those questions from Review_Scores table. &lt;br /&gt;
&lt;br /&gt;
=== Questions Table Structure===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- Questions page already exists,so created a page with the name Questions table and gave an external link on the tables page--&amp;gt;&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
!Field Name !!Type !!Description &lt;br /&gt;
|-&lt;br /&gt;
!id &lt;br /&gt;
|int(11)  &lt;br /&gt;
|unique identifier for the record&lt;br /&gt;
|- &lt;br /&gt;
!txt   &lt;br /&gt;
|text  &lt;br /&gt;
|the question string&lt;br /&gt;
|- &lt;br /&gt;
!weight   &lt;br /&gt;
|int(11)  &lt;br /&gt;
|specifies the weighting of the question&lt;br /&gt;
|- &lt;br /&gt;
!questionnaire_id   &lt;br /&gt;
|int(11)&lt;br /&gt;
|the id of the questionnaire that this question belongs to&lt;br /&gt;
|-&lt;br /&gt;
!seq&lt;br /&gt;
|DECIMAL&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
!type&lt;br /&gt;
|VARCHAR(255)&lt;br /&gt;
|Type of question&lt;br /&gt;
|-&lt;br /&gt;
!size&lt;br /&gt;
|VARCHAR(255)&lt;br /&gt;
|Size of the question&lt;br /&gt;
|-&lt;br /&gt;
!alternatives&lt;br /&gt;
|VARCHAR(255)&lt;br /&gt;
|Other question which means the same&lt;br /&gt;
|-&lt;br /&gt;
!break_before&lt;br /&gt;
|BIT&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
!max_label&lt;br /&gt;
|VARCHAR(255)&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
!min_label&lt;br /&gt;
|VARCHAR(255)&lt;br /&gt;
|&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Answer Table Structure ===&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
!Field Name !!Type !!Description &lt;br /&gt;
|- &lt;br /&gt;
!id   &lt;br /&gt;
|int(11)  &lt;br /&gt;
|Unique ID for each Answers record.&lt;br /&gt;
|- &lt;br /&gt;
!question_id   &lt;br /&gt;
|int(11) &lt;br /&gt;
|ID of Question.&lt;br /&gt;
|- &lt;br /&gt;
!answer   &lt;br /&gt;
|int(11)  &lt;br /&gt;
|Value of each of the answer.&lt;br /&gt;
|- &lt;br /&gt;
!comments  &lt;br /&gt;
|text  &lt;br /&gt;
|Comment given to the answer.&lt;br /&gt;
|- &lt;br /&gt;
!reponse_id   &lt;br /&gt;
|int(11) &lt;br /&gt;
|ID of the response associated with this Answer.&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Response Table Structure ===&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
!Field Name !!Type !!Description &lt;br /&gt;
|-&lt;br /&gt;
!id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The unique record id&lt;br /&gt;
|-&lt;br /&gt;
!map_id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The ID of the [[response_maps|response map]] defining the relationship that this response applies to&lt;br /&gt;
|-&lt;br /&gt;
!additional_comment&lt;br /&gt;
|text&lt;br /&gt;
|An additional comment provided by the reviewer to support his/her response&lt;br /&gt;
|-&lt;br /&gt;
!updated_at&lt;br /&gt;
|datetime&lt;br /&gt;
|The timestamp indicating when this response was last modified&lt;br /&gt;
|-&lt;br /&gt;
!created_at&lt;br /&gt;
|datetime&lt;br /&gt;
|The timestamp indicating when this response was created&lt;br /&gt;
|-&lt;br /&gt;
!version_num&lt;br /&gt;
|int(11)&lt;br /&gt;
|The version of the review.&lt;br /&gt;
|-&lt;br /&gt;
!round&lt;br /&gt;
|int(11)&lt;br /&gt;
|The round the review is connected to. &lt;br /&gt;
|-&lt;br /&gt;
!is_submitted&lt;br /&gt;
|tinyint(1)&lt;br /&gt;
|Boolean Field to indicate whether the review is submitted.&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Response Map Table ===&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
!Field Name !!Type !!Description &lt;br /&gt;
|-&lt;br /&gt;
!id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The unique record id&lt;br /&gt;
|- &lt;br /&gt;
!reviewed_object_id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The object being reviewed in the [[responses|response]]. Possible objects include other ResponseMaps or [[assignments]]&lt;br /&gt;
|-&lt;br /&gt;
!reviewer_id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The [[participants|participant]] (actually AssignmentParticipant) providing the response&lt;br /&gt;
|-&lt;br /&gt;
!reviewee_id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The [[teams|team]] (AssignmentTeam) receiving the response&lt;br /&gt;
|-&lt;br /&gt;
!type&lt;br /&gt;
|varchar(255)&lt;br /&gt;
|Used for subclassing the response map. Available subclasses are ReviewResponseMap, MetareviewResponseMap, FeedbackResponseMap, TeammateReviewResponseMap  &lt;br /&gt;
|-&lt;br /&gt;
!created_at&lt;br /&gt;
|DATETIME&lt;br /&gt;
|Date and Time for when the record was created&lt;br /&gt;
|-&lt;br /&gt;
!updated_at&lt;br /&gt;
|DATETIME&lt;br /&gt;
|Date and Time when the last update was made&lt;br /&gt;
|-&lt;br /&gt;
!calibrate_to&lt;br /&gt;
|BIT&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
== UI Implementation ==&lt;br /&gt;
&lt;br /&gt;
In the page Review report for Design exercise ( login as an instructor -&amp;gt; Manage -&amp;gt; Assignments -&amp;gt; View review report ), we are planning to add one more column to show the average ratings for the authors feedback on a particular assignment. The logic for calculating the average score for the feedback would be similar to already implemented logic for score awarded/ average score column. Below attached shows the page we are planning to edit.&lt;br /&gt;
&lt;br /&gt;
[[File:Feedback_new.png]]&lt;br /&gt;
&lt;br /&gt;
== Code Logic ==&lt;br /&gt;
&lt;br /&gt;
Following shows the code logic we are planning to write for calculating the avg scores for the feedback given by authors.&lt;br /&gt;
&amp;lt;pre style=&amp;quot;color: black; border:1px;&amp;quot;&amp;gt;&lt;br /&gt;
 def calculate_avg_score_by_criterion(question_answers, q_max_score)&lt;br /&gt;
      # get score and summary of answers for each question&lt;br /&gt;
      # only include divide the valid_answer_sum with the number of valid answers&lt;br /&gt;
&lt;br /&gt;
      valid_answer_counter = 0&lt;br /&gt;
      question_score = 0.0&lt;br /&gt;
      question_answers.each do |ans|&lt;br /&gt;
        # calculate score per question&lt;br /&gt;
        unless ans.answer.nil?&lt;br /&gt;
          question_score += ans.answer&lt;br /&gt;
          valid_answer_counter += 1&lt;br /&gt;
        end&lt;br /&gt;
      end&lt;br /&gt;
&lt;br /&gt;
      if valid_answer_counter &amp;gt; 0 and q_max_score &amp;gt; 0&lt;br /&gt;
        # convert the score in percentage&lt;br /&gt;
        question_score /= (valid_answer_counter * q_max_score)&lt;br /&gt;
        question_score = question_score.round(2) * 100&lt;br /&gt;
      end&lt;br /&gt;
&lt;br /&gt;
      question_score&lt;br /&gt;
    end&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Test Plan =&lt;br /&gt;
* Plan to test the response report page (/review_mapping/response_report?id={:assignmentID}) to make sure the new field (avg author feedback) exists&lt;br /&gt;
** Using rspec we will add a test case to ReviewMappingControllerSpec.rb&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
    context 'when type is FeedbackResponseMap' do&lt;br /&gt;
      context 'when assignment has author feedback feature' do&lt;br /&gt;
        it 'renders response_report page with average author feedback data' do&lt;br /&gt;
          allow(assignment).to receive(:varying_rubrics_by_round?).and_return(true)&lt;br /&gt;
          allow(FeedbackResponseMap).to receive(:feedback_response_report).with('1', 'FeedbackResponseMap')&lt;br /&gt;
                                                                          .and_return([participant, participant1], [1, 2], [3, 4], [])&lt;br /&gt;
          params = {&lt;br /&gt;
            id: 1,&lt;br /&gt;
            report: {type: 'FeedbackResponseMap'},&lt;br /&gt;
          }&lt;br /&gt;
          get :response_report, params&lt;br /&gt;
          expect(response).to render_template(:response_report)&lt;br /&gt;
          expect(response).to have(:avg_author_feedback)&lt;br /&gt;
        end&lt;br /&gt;
      end&lt;br /&gt;
   end&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
* Also, we plan to manually test the response report page to make sure the new field is aligning well in the UI in the expected place. We will attach the screenshot of the UI as the test result.&lt;br /&gt;
&lt;br /&gt;
= References =&lt;br /&gt;
1) http://wiki.expertiza.ncsu.edu/index.php/Documentation_on_Database_Tables&lt;br /&gt;
&lt;br /&gt;
2) https://github.com/jainmohit1/expertiza&lt;/div&gt;</summary>
		<author><name>Psingh22</name></author>
	</entry>
	<entry>
		<id>https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2018/E1876_Completion/Progress_view&amp;diff=119765</id>
		<title>CSC/ECE 517 Fall 2018/E1876 Completion/Progress view</title>
		<link rel="alternate" type="text/html" href="https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2018/E1876_Completion/Progress_view&amp;diff=119765"/>
		<updated>2018-11-13T21:30:37Z</updated>

		<summary type="html">&lt;p&gt;Psingh22: /* References */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;='''Problem Statement'''=&lt;br /&gt;
In Expertiza, peer reviews are used as a metric to evaluate someone’s project. Once someone has peer reviewed a project, the authors of the project can also provide a feedback for this review in terms of ‘Author feedback’. While grading peer reviews, it would be nice for the instructors to take into account the author feedbacks given on a particular peer review, this will be helpful in evaluating how helpful the peer review actually was to the author of the project.&lt;br /&gt;
 &lt;br /&gt;
='''Goal'''=&lt;br /&gt;
&lt;br /&gt;
The aim of this project is to build this into the system. We need an additional column in the 'Review Report' page for reviews which shows the calculation of the author feedback. This will help instructor's to know how the reviews proved useful to the authors/team. The aim of this project is to integrate the author feedback column in the summary page&lt;br /&gt;
&lt;br /&gt;
='''Design'''=&lt;br /&gt;
&lt;br /&gt;
== Database ==&lt;br /&gt;
&lt;br /&gt;
The following are the table structures we will need for mapping. First, the questions table has all the questions based on the questionnaire. We will be only concerned with the questions in the feedback questionnaire. The answers for each question in the feedback questionnaire is saved in Answers table below based on Question ID. Now, in order to know if the answers is a feedback by team members or a review by reviewer, the mapping for Answers table is done by response_id which is a foreign key to response table. Response table gives us map_id which maps to Response Maps table. Now, Response Map table gives us information of the reviewer_id, reviewee_id, reviewed_object_id (which is the id for the assignment being reviewed) and the type (whether it's a teammate review, author feedback or a regular review). We will have to fetch the answers from the Answer table based on response_id because in our case, the response is from a reviewee and not a reviewer. So, we will fetch those answers whose response type is FeedbackResponseMap and calculate scores for those questions from Review_Scores table. &lt;br /&gt;
&lt;br /&gt;
=== Questions Table Structure===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- Questions page already exists,so created a page with the name Questions table and gave an external link on the tables page--&amp;gt;&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
!Field Name !!Type !!Description &lt;br /&gt;
|-&lt;br /&gt;
!id &lt;br /&gt;
|int(11)  &lt;br /&gt;
|unique identifier for the record&lt;br /&gt;
|- &lt;br /&gt;
!txt   &lt;br /&gt;
|text  &lt;br /&gt;
|the question string&lt;br /&gt;
|- &lt;br /&gt;
!weight   &lt;br /&gt;
|int(11)  &lt;br /&gt;
|specifies the weighting of the question&lt;br /&gt;
|- &lt;br /&gt;
!questionnaire_id   &lt;br /&gt;
|int(11)&lt;br /&gt;
|the id of the questionnaire that this question belongs to&lt;br /&gt;
|-&lt;br /&gt;
!seq&lt;br /&gt;
|DECIMAL&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
!type&lt;br /&gt;
|VARCHAR(255)&lt;br /&gt;
|Type of question&lt;br /&gt;
|-&lt;br /&gt;
!size&lt;br /&gt;
|VARCHAR(255)&lt;br /&gt;
|Size of the question&lt;br /&gt;
|-&lt;br /&gt;
!alternatives&lt;br /&gt;
|VARCHAR(255)&lt;br /&gt;
|Other question which means the same&lt;br /&gt;
|-&lt;br /&gt;
!break_before&lt;br /&gt;
|BIT&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
!max_label&lt;br /&gt;
|VARCHAR(255)&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
!min_label&lt;br /&gt;
|VARCHAR(255)&lt;br /&gt;
|&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Answer Table Structure ===&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
!Field Name !!Type !!Description &lt;br /&gt;
|- &lt;br /&gt;
!id   &lt;br /&gt;
|int(11)  &lt;br /&gt;
|Unique ID for each Answers record.&lt;br /&gt;
|- &lt;br /&gt;
!question_id   &lt;br /&gt;
|int(11) &lt;br /&gt;
|ID of Question.&lt;br /&gt;
|- &lt;br /&gt;
!answer   &lt;br /&gt;
|int(11)  &lt;br /&gt;
|Value of each of the answer.&lt;br /&gt;
|- &lt;br /&gt;
!comments  &lt;br /&gt;
|text  &lt;br /&gt;
|Comment given to the answer.&lt;br /&gt;
|- &lt;br /&gt;
!reponse_id   &lt;br /&gt;
|int(11) &lt;br /&gt;
|ID of the response associated with this Answer.&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Response Table Structure ===&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
!Field Name !!Type !!Description &lt;br /&gt;
|-&lt;br /&gt;
!id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The unique record id&lt;br /&gt;
|-&lt;br /&gt;
!map_id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The ID of the [[response_maps|response map]] defining the relationship that this response applies to&lt;br /&gt;
|-&lt;br /&gt;
!additional_comment&lt;br /&gt;
|text&lt;br /&gt;
|An additional comment provided by the reviewer to support his/her response&lt;br /&gt;
|-&lt;br /&gt;
!updated_at&lt;br /&gt;
|datetime&lt;br /&gt;
|The timestamp indicating when this response was last modified&lt;br /&gt;
|-&lt;br /&gt;
!created_at&lt;br /&gt;
|datetime&lt;br /&gt;
|The timestamp indicating when this response was created&lt;br /&gt;
|-&lt;br /&gt;
!version_num&lt;br /&gt;
|int(11)&lt;br /&gt;
|The version of the review.&lt;br /&gt;
|-&lt;br /&gt;
!round&lt;br /&gt;
|int(11)&lt;br /&gt;
|The round the review is connected to. &lt;br /&gt;
|-&lt;br /&gt;
!is_submitted&lt;br /&gt;
|tinyint(1)&lt;br /&gt;
|Boolean Field to indicate whether the review is submitted.&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Response Map Table ===&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
!Field Name !!Type !!Description &lt;br /&gt;
|-&lt;br /&gt;
!id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The unique record id&lt;br /&gt;
|- &lt;br /&gt;
!reviewed_object_id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The object being reviewed in the [[responses|response]]. Possible objects include other ResponseMaps or [[assignments]]&lt;br /&gt;
|-&lt;br /&gt;
!reviewer_id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The [[participants|participant]] (actually AssignmentParticipant) providing the response&lt;br /&gt;
|-&lt;br /&gt;
!reviewee_id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The [[teams|team]] (AssignmentTeam) receiving the response&lt;br /&gt;
|-&lt;br /&gt;
!type&lt;br /&gt;
|varchar(255)&lt;br /&gt;
|Used for subclassing the response map. Available subclasses are ReviewResponseMap, MetareviewResponseMap, FeedbackResponseMap, TeammateReviewResponseMap  &lt;br /&gt;
|-&lt;br /&gt;
!created_at&lt;br /&gt;
|DATETIME&lt;br /&gt;
|Date and Time for when the record was created&lt;br /&gt;
|-&lt;br /&gt;
!updated_at&lt;br /&gt;
|DATETIME&lt;br /&gt;
|Date and Time when the last update was made&lt;br /&gt;
|-&lt;br /&gt;
!calibrate_to&lt;br /&gt;
|BIT&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
== UI Implementation ==&lt;br /&gt;
&lt;br /&gt;
In the page Review report for Design exercise ( login as an instructor -&amp;gt; Manage -&amp;gt; Assignments -&amp;gt; View review report ), we are planning to add one more column to show the average ratings for the authors feedback on a particular assignment. The logic for calculating the average score for the feedback would be similar to already implemented logic for score awarded/ average score column. Below attached shows the page we are planning to edit.&lt;br /&gt;
&lt;br /&gt;
[[File:Feedback_new.png]]&lt;br /&gt;
&lt;br /&gt;
== Code Logic ==&lt;br /&gt;
&lt;br /&gt;
Following shows the code logic we are planning to write for calculating the avg scores for the feedback given by authors.&lt;br /&gt;
&amp;lt;pre style=&amp;quot;color: black; border:1px;&amp;quot;&amp;gt;&lt;br /&gt;
 def calculate_avg_score_by_criterion(question_answers, q_max_score)&lt;br /&gt;
      # get score and summary of answers for each question&lt;br /&gt;
      # only include divide the valid_answer_sum with the number of valid answers&lt;br /&gt;
&lt;br /&gt;
      valid_answer_counter = 0&lt;br /&gt;
      question_score = 0.0&lt;br /&gt;
      question_answers.each do |ans|&lt;br /&gt;
        # calculate score per question&lt;br /&gt;
        unless ans.answer.nil?&lt;br /&gt;
          question_score += ans.answer&lt;br /&gt;
          valid_answer_counter += 1&lt;br /&gt;
        end&lt;br /&gt;
      end&lt;br /&gt;
&lt;br /&gt;
      if valid_answer_counter &amp;gt; 0 and q_max_score &amp;gt; 0&lt;br /&gt;
        # convert the score in percentage&lt;br /&gt;
        question_score /= (valid_answer_counter * q_max_score)&lt;br /&gt;
        question_score = question_score.round(2) * 100&lt;br /&gt;
      end&lt;br /&gt;
&lt;br /&gt;
      question_score&lt;br /&gt;
    end&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Test Plan =&lt;br /&gt;
* Plan to test the response report page (/review_mapping/response_report?id={:assignmentID}) to make sure the new field (avg author feedback) exists&lt;br /&gt;
** Using rspec we will add a test case to ReviewMappingControllerSpec.rb&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
    context 'when type is FeedbackResponseMap' do&lt;br /&gt;
      context 'when assignment has author feedback feature' do&lt;br /&gt;
        it 'renders response_report page with average author feedback data' do&lt;br /&gt;
          allow(assignment).to receive(:varying_rubrics_by_round?).and_return(true)&lt;br /&gt;
          allow(FeedbackResponseMap).to receive(:feedback_response_report).with('1', 'FeedbackResponseMap')&lt;br /&gt;
                                                                          .and_return([participant, participant1], [1, 2], [3, 4], [])&lt;br /&gt;
          params = {&lt;br /&gt;
            id: 1,&lt;br /&gt;
            report: {type: 'FeedbackResponseMap'},&lt;br /&gt;
          }&lt;br /&gt;
          get :response_report, params&lt;br /&gt;
          expect(response).to render_template(:response_report)&lt;br /&gt;
          expect(response).to have(:avg_author_feedback)&lt;br /&gt;
        end&lt;br /&gt;
      end&lt;br /&gt;
   end&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
* Also, we plan to manually test the response report page to make sure the new field is aligning well in the UI in the expected place. We will attach the screenshot of the UI as the test result.&lt;br /&gt;
&lt;br /&gt;
= References =&lt;br /&gt;
1) http://wiki.expertiza.ncsu.edu/index.php/Documentation_on_Database_Tables&lt;br /&gt;
1) https://github.com/jainmohit1/expertiza&lt;/div&gt;</summary>
		<author><name>Psingh22</name></author>
	</entry>
	<entry>
		<id>https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2018/E1876_Completion/Progress_view&amp;diff=119760</id>
		<title>CSC/ECE 517 Fall 2018/E1876 Completion/Progress view</title>
		<link rel="alternate" type="text/html" href="https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2018/E1876_Completion/Progress_view&amp;diff=119760"/>
		<updated>2018-11-13T21:07:31Z</updated>

		<summary type="html">&lt;p&gt;Psingh22: /* Test Plan */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;='''Problem Statement'''=&lt;br /&gt;
In Expertiza, peer reviews are used as a metric to evaluate someone’s project. Once someone has peer reviewed a project, the authors of the project can also provide a feedback for this review in terms of ‘Author feedback’. While grading peer reviews, it would be nice for the instructors to take into account the author feedbacks given on a particular peer review, this will be helpful in evaluating how helpful the peer review actually was to the author of the project.&lt;br /&gt;
 &lt;br /&gt;
='''Goal'''=&lt;br /&gt;
&lt;br /&gt;
The aim of this project is to build this into the system. We need an additional column in the 'Review Report' page for reviews which shows the calculation of the author feedback. This will help instructor's to know how the reviews proved useful to the authors/team. The aim of this project is to integrate the author feedback column in the summary page&lt;br /&gt;
&lt;br /&gt;
='''Design'''=&lt;br /&gt;
&lt;br /&gt;
== Database ==&lt;br /&gt;
&lt;br /&gt;
The following are the table structures we will need for mapping. First, the questions table has all the questions based on the questionnaire. We will be only concerned with the questions in the feedback questionnaire. The answers for each question in the feedback questionnaire is saved in Answers table below based on Question ID. Now, in order to know if the answers is a feedback by team members or a review by reviewer, the mapping for Answers table is done by response_id which is a foreign key to response table. Response table gives us map_id which maps to Response Maps table. Now, Response Map table gives us information of the reviewer_id, reviewee_id, reviewed_object_id (which is the id for the assignment being reviewed) and the type (whether it's a teammate review, author feedback or a regular review). We will have to fetch the answers from the Answer table based on response_id because in our case, the response is from a reviewee and not a reviewer. So, we will fetch those answers whose response type is FeedbackResponseMap and calculate scores for those questions from Review_Scores table. &lt;br /&gt;
&lt;br /&gt;
=== Questions Table Structure===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- Questions page already exists,so created a page with the name Questions table and gave an external link on the tables page--&amp;gt;&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
!Field Name !!Type !!Description &lt;br /&gt;
|-&lt;br /&gt;
!id &lt;br /&gt;
|int(11)  &lt;br /&gt;
|unique identifier for the record&lt;br /&gt;
|- &lt;br /&gt;
!txt   &lt;br /&gt;
|text  &lt;br /&gt;
|the question string&lt;br /&gt;
|- &lt;br /&gt;
!weight   &lt;br /&gt;
|int(11)  &lt;br /&gt;
|specifies the weighting of the question&lt;br /&gt;
|- &lt;br /&gt;
!questionnaire_id   &lt;br /&gt;
|int(11)&lt;br /&gt;
|the id of the questionnaire that this question belongs to&lt;br /&gt;
|-&lt;br /&gt;
!seq&lt;br /&gt;
|DECIMAL&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
!type&lt;br /&gt;
|VARCHAR(255)&lt;br /&gt;
|Type of question&lt;br /&gt;
|-&lt;br /&gt;
!size&lt;br /&gt;
|VARCHAR(255)&lt;br /&gt;
|Size of the question&lt;br /&gt;
|-&lt;br /&gt;
!alternatives&lt;br /&gt;
|VARCHAR(255)&lt;br /&gt;
|Other question which means the same&lt;br /&gt;
|-&lt;br /&gt;
!break_before&lt;br /&gt;
|BIT&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
!max_label&lt;br /&gt;
|VARCHAR(255)&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
!min_label&lt;br /&gt;
|VARCHAR(255)&lt;br /&gt;
|&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Answer Table Structure ===&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
!Field Name !!Type !!Description &lt;br /&gt;
|- &lt;br /&gt;
!id   &lt;br /&gt;
|int(11)  &lt;br /&gt;
|Unique ID for each Answers record.&lt;br /&gt;
|- &lt;br /&gt;
!question_id   &lt;br /&gt;
|int(11) &lt;br /&gt;
|ID of Question.&lt;br /&gt;
|- &lt;br /&gt;
!answer   &lt;br /&gt;
|int(11)  &lt;br /&gt;
|Value of each of the answer.&lt;br /&gt;
|- &lt;br /&gt;
!comments  &lt;br /&gt;
|text  &lt;br /&gt;
|Comment given to the answer.&lt;br /&gt;
|- &lt;br /&gt;
!reponse_id   &lt;br /&gt;
|int(11) &lt;br /&gt;
|ID of the response associated with this Answer.&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Response Table Structure ===&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
!Field Name !!Type !!Description &lt;br /&gt;
|-&lt;br /&gt;
!id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The unique record id&lt;br /&gt;
|-&lt;br /&gt;
!map_id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The ID of the [[response_maps|response map]] defining the relationship that this response applies to&lt;br /&gt;
|-&lt;br /&gt;
!additional_comment&lt;br /&gt;
|text&lt;br /&gt;
|An additional comment provided by the reviewer to support his/her response&lt;br /&gt;
|-&lt;br /&gt;
!updated_at&lt;br /&gt;
|datetime&lt;br /&gt;
|The timestamp indicating when this response was last modified&lt;br /&gt;
|-&lt;br /&gt;
!created_at&lt;br /&gt;
|datetime&lt;br /&gt;
|The timestamp indicating when this response was created&lt;br /&gt;
|-&lt;br /&gt;
!version_num&lt;br /&gt;
|int(11)&lt;br /&gt;
|The version of the review.&lt;br /&gt;
|-&lt;br /&gt;
!round&lt;br /&gt;
|int(11)&lt;br /&gt;
|The round the review is connected to. &lt;br /&gt;
|-&lt;br /&gt;
!is_submitted&lt;br /&gt;
|tinyint(1)&lt;br /&gt;
|Boolean Field to indicate whether the review is submitted.&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Response Map Table ===&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
!Field Name !!Type !!Description &lt;br /&gt;
|-&lt;br /&gt;
!id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The unique record id&lt;br /&gt;
|- &lt;br /&gt;
!reviewed_object_id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The object being reviewed in the [[responses|response]]. Possible objects include other ResponseMaps or [[assignments]]&lt;br /&gt;
|-&lt;br /&gt;
!reviewer_id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The [[participants|participant]] (actually AssignmentParticipant) providing the response&lt;br /&gt;
|-&lt;br /&gt;
!reviewee_id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The [[teams|team]] (AssignmentTeam) receiving the response&lt;br /&gt;
|-&lt;br /&gt;
!type&lt;br /&gt;
|varchar(255)&lt;br /&gt;
|Used for subclassing the response map. Available subclasses are ReviewResponseMap, MetareviewResponseMap, FeedbackResponseMap, TeammateReviewResponseMap  &lt;br /&gt;
|-&lt;br /&gt;
!created_at&lt;br /&gt;
|DATETIME&lt;br /&gt;
|Date and Time for when the record was created&lt;br /&gt;
|-&lt;br /&gt;
!updated_at&lt;br /&gt;
|DATETIME&lt;br /&gt;
|Date and Time when the last update was made&lt;br /&gt;
|-&lt;br /&gt;
!calibrate_to&lt;br /&gt;
|BIT&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
== UI Implementation ==&lt;br /&gt;
&lt;br /&gt;
In the page Review report for Design exercise ( login as an instructor -&amp;gt; Manage -&amp;gt; Assignments -&amp;gt; View review report ), we are planning to add one more column to show the average ratings for the authors feedback on a particular assignment. The logic for calculating the average score for the feedback would be similar to already implemented logic for score awarded/ average score column. Below attached shows the page we are planning to edit.&lt;br /&gt;
&lt;br /&gt;
[[File:Feedback_new.png]]&lt;br /&gt;
&lt;br /&gt;
== Code Logic ==&lt;br /&gt;
&lt;br /&gt;
Following shows the code logic we are planning to write for calculating the avg scores for the feedback given by authors.&lt;br /&gt;
&amp;lt;pre style=&amp;quot;color: black; border:1px;&amp;quot;&amp;gt;&lt;br /&gt;
 def calculate_avg_score_by_criterion(question_answers, q_max_score)&lt;br /&gt;
      # get score and summary of answers for each question&lt;br /&gt;
      # only include divide the valid_answer_sum with the number of valid answers&lt;br /&gt;
&lt;br /&gt;
      valid_answer_counter = 0&lt;br /&gt;
      question_score = 0.0&lt;br /&gt;
      question_answers.each do |ans|&lt;br /&gt;
        # calculate score per question&lt;br /&gt;
        unless ans.answer.nil?&lt;br /&gt;
          question_score += ans.answer&lt;br /&gt;
          valid_answer_counter += 1&lt;br /&gt;
        end&lt;br /&gt;
      end&lt;br /&gt;
&lt;br /&gt;
      if valid_answer_counter &amp;gt; 0 and q_max_score &amp;gt; 0&lt;br /&gt;
        # convert the score in percentage&lt;br /&gt;
        question_score /= (valid_answer_counter * q_max_score)&lt;br /&gt;
        question_score = question_score.round(2) * 100&lt;br /&gt;
      end&lt;br /&gt;
&lt;br /&gt;
      question_score&lt;br /&gt;
    end&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Test Plan =&lt;br /&gt;
* Plan to test the response report page (/review_mapping/response_report?id={:assignmentID}) to make sure the new field (avg author feedback) exists&lt;br /&gt;
** Using rspec we will add a test case to ReviewMappingControllerSpec.rb&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
    context 'when type is FeedbackResponseMap' do&lt;br /&gt;
      context 'when assignment has author feedback feature' do&lt;br /&gt;
        it 'renders response_report page with average author feedback data' do&lt;br /&gt;
          allow(assignment).to receive(:varying_rubrics_by_round?).and_return(true)&lt;br /&gt;
          allow(FeedbackResponseMap).to receive(:feedback_response_report).with('1', 'FeedbackResponseMap')&lt;br /&gt;
                                                                          .and_return([participant, participant1], [1, 2], [3, 4], [])&lt;br /&gt;
          params = {&lt;br /&gt;
            id: 1,&lt;br /&gt;
            report: {type: 'FeedbackResponseMap'},&lt;br /&gt;
          }&lt;br /&gt;
          get :response_report, params&lt;br /&gt;
          expect(response).to render_template(:response_report)&lt;br /&gt;
          expect(response).to have(:avg_author_feedback)&lt;br /&gt;
        end&lt;br /&gt;
      end&lt;br /&gt;
   end&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
* Also, we plan to manually test the response report page to make sure the new field is aligning well in the UI in the expected place. We will attach the screenshot of the UI as the test result.&lt;br /&gt;
&lt;br /&gt;
= References =&lt;br /&gt;
1) http://wiki.expertiza.ncsu.edu/index.php/Documentation_on_Database_Tables&lt;/div&gt;</summary>
		<author><name>Psingh22</name></author>
	</entry>
	<entry>
		<id>https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2018/E1876_Completion/Progress_view&amp;diff=119759</id>
		<title>CSC/ECE 517 Fall 2018/E1876 Completion/Progress view</title>
		<link rel="alternate" type="text/html" href="https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2018/E1876_Completion/Progress_view&amp;diff=119759"/>
		<updated>2018-11-13T21:06:47Z</updated>

		<summary type="html">&lt;p&gt;Psingh22: /* Test Plan */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;='''Problem Statement'''=&lt;br /&gt;
In Expertiza, peer reviews are used as a metric to evaluate someone’s project. Once someone has peer reviewed a project, the authors of the project can also provide a feedback for this review in terms of ‘Author feedback’. While grading peer reviews, it would be nice for the instructors to take into account the author feedbacks given on a particular peer review, this will be helpful in evaluating how helpful the peer review actually was to the author of the project.&lt;br /&gt;
 &lt;br /&gt;
='''Goal'''=&lt;br /&gt;
&lt;br /&gt;
The aim of this project is to build this into the system. We need an additional column in the 'Review Report' page for reviews which shows the calculation of the author feedback. This will help instructor's to know how the reviews proved useful to the authors/team. The aim of this project is to integrate the author feedback column in the summary page&lt;br /&gt;
&lt;br /&gt;
='''Design'''=&lt;br /&gt;
&lt;br /&gt;
== Database ==&lt;br /&gt;
&lt;br /&gt;
The following are the table structures we will need for mapping. First, the questions table has all the questions based on the questionnaire. We will be only concerned with the questions in the feedback questionnaire. The answers for each question in the feedback questionnaire is saved in Answers table below based on Question ID. Now, in order to know if the answers is a feedback by team members or a review by reviewer, the mapping for Answers table is done by response_id which is a foreign key to response table. Response table gives us map_id which maps to Response Maps table. Now, Response Map table gives us information of the reviewer_id, reviewee_id, reviewed_object_id (which is the id for the assignment being reviewed) and the type (whether it's a teammate review, author feedback or a regular review). We will have to fetch the answers from the Answer table based on response_id because in our case, the response is from a reviewee and not a reviewer. So, we will fetch those answers whose response type is FeedbackResponseMap and calculate scores for those questions from Review_Scores table. &lt;br /&gt;
&lt;br /&gt;
=== Questions Table Structure===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- Questions page already exists,so created a page with the name Questions table and gave an external link on the tables page--&amp;gt;&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
!Field Name !!Type !!Description &lt;br /&gt;
|-&lt;br /&gt;
!id &lt;br /&gt;
|int(11)  &lt;br /&gt;
|unique identifier for the record&lt;br /&gt;
|- &lt;br /&gt;
!txt   &lt;br /&gt;
|text  &lt;br /&gt;
|the question string&lt;br /&gt;
|- &lt;br /&gt;
!weight   &lt;br /&gt;
|int(11)  &lt;br /&gt;
|specifies the weighting of the question&lt;br /&gt;
|- &lt;br /&gt;
!questionnaire_id   &lt;br /&gt;
|int(11)&lt;br /&gt;
|the id of the questionnaire that this question belongs to&lt;br /&gt;
|-&lt;br /&gt;
!seq&lt;br /&gt;
|DECIMAL&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
!type&lt;br /&gt;
|VARCHAR(255)&lt;br /&gt;
|Type of question&lt;br /&gt;
|-&lt;br /&gt;
!size&lt;br /&gt;
|VARCHAR(255)&lt;br /&gt;
|Size of the question&lt;br /&gt;
|-&lt;br /&gt;
!alternatives&lt;br /&gt;
|VARCHAR(255)&lt;br /&gt;
|Other question which means the same&lt;br /&gt;
|-&lt;br /&gt;
!break_before&lt;br /&gt;
|BIT&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
!max_label&lt;br /&gt;
|VARCHAR(255)&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
!min_label&lt;br /&gt;
|VARCHAR(255)&lt;br /&gt;
|&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Answer Table Structure ===&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
!Field Name !!Type !!Description &lt;br /&gt;
|- &lt;br /&gt;
!id   &lt;br /&gt;
|int(11)  &lt;br /&gt;
|Unique ID for each Answers record.&lt;br /&gt;
|- &lt;br /&gt;
!question_id   &lt;br /&gt;
|int(11) &lt;br /&gt;
|ID of Question.&lt;br /&gt;
|- &lt;br /&gt;
!answer   &lt;br /&gt;
|int(11)  &lt;br /&gt;
|Value of each of the answer.&lt;br /&gt;
|- &lt;br /&gt;
!comments  &lt;br /&gt;
|text  &lt;br /&gt;
|Comment given to the answer.&lt;br /&gt;
|- &lt;br /&gt;
!reponse_id   &lt;br /&gt;
|int(11) &lt;br /&gt;
|ID of the response associated with this Answer.&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Response Table Structure ===&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
!Field Name !!Type !!Description &lt;br /&gt;
|-&lt;br /&gt;
!id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The unique record id&lt;br /&gt;
|-&lt;br /&gt;
!map_id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The ID of the [[response_maps|response map]] defining the relationship that this response applies to&lt;br /&gt;
|-&lt;br /&gt;
!additional_comment&lt;br /&gt;
|text&lt;br /&gt;
|An additional comment provided by the reviewer to support his/her response&lt;br /&gt;
|-&lt;br /&gt;
!updated_at&lt;br /&gt;
|datetime&lt;br /&gt;
|The timestamp indicating when this response was last modified&lt;br /&gt;
|-&lt;br /&gt;
!created_at&lt;br /&gt;
|datetime&lt;br /&gt;
|The timestamp indicating when this response was created&lt;br /&gt;
|-&lt;br /&gt;
!version_num&lt;br /&gt;
|int(11)&lt;br /&gt;
|The version of the review.&lt;br /&gt;
|-&lt;br /&gt;
!round&lt;br /&gt;
|int(11)&lt;br /&gt;
|The round the review is connected to. &lt;br /&gt;
|-&lt;br /&gt;
!is_submitted&lt;br /&gt;
|tinyint(1)&lt;br /&gt;
|Boolean Field to indicate whether the review is submitted.&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Response Map Table ===&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
!Field Name !!Type !!Description &lt;br /&gt;
|-&lt;br /&gt;
!id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The unique record id&lt;br /&gt;
|- &lt;br /&gt;
!reviewed_object_id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The object being reviewed in the [[responses|response]]. Possible objects include other ResponseMaps or [[assignments]]&lt;br /&gt;
|-&lt;br /&gt;
!reviewer_id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The [[participants|participant]] (actually AssignmentParticipant) providing the response&lt;br /&gt;
|-&lt;br /&gt;
!reviewee_id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The [[teams|team]] (AssignmentTeam) receiving the response&lt;br /&gt;
|-&lt;br /&gt;
!type&lt;br /&gt;
|varchar(255)&lt;br /&gt;
|Used for subclassing the response map. Available subclasses are ReviewResponseMap, MetareviewResponseMap, FeedbackResponseMap, TeammateReviewResponseMap  &lt;br /&gt;
|-&lt;br /&gt;
!created_at&lt;br /&gt;
|DATETIME&lt;br /&gt;
|Date and Time for when the record was created&lt;br /&gt;
|-&lt;br /&gt;
!updated_at&lt;br /&gt;
|DATETIME&lt;br /&gt;
|Date and Time when the last update was made&lt;br /&gt;
|-&lt;br /&gt;
!calibrate_to&lt;br /&gt;
|BIT&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
== UI Implementation ==&lt;br /&gt;
&lt;br /&gt;
In the page Review report for Design exercise ( login as an instructor -&amp;gt; Manage -&amp;gt; Assignments -&amp;gt; View review report ), we are planning to add one more column to show the average ratings for the authors feedback on a particular assignment. The logic for calculating the average score for the feedback would be similar to already implemented logic for score awarded/ average score column. Below attached shows the page we are planning to edit.&lt;br /&gt;
&lt;br /&gt;
[[File:Feedback_new.png]]&lt;br /&gt;
&lt;br /&gt;
== Code Logic ==&lt;br /&gt;
&lt;br /&gt;
Following shows the code logic we are planning to write for calculating the avg scores for the feedback given by authors.&lt;br /&gt;
&amp;lt;pre style=&amp;quot;color: black; border:1px;&amp;quot;&amp;gt;&lt;br /&gt;
 def calculate_avg_score_by_criterion(question_answers, q_max_score)&lt;br /&gt;
      # get score and summary of answers for each question&lt;br /&gt;
      # only include divide the valid_answer_sum with the number of valid answers&lt;br /&gt;
&lt;br /&gt;
      valid_answer_counter = 0&lt;br /&gt;
      question_score = 0.0&lt;br /&gt;
      question_answers.each do |ans|&lt;br /&gt;
        # calculate score per question&lt;br /&gt;
        unless ans.answer.nil?&lt;br /&gt;
          question_score += ans.answer&lt;br /&gt;
          valid_answer_counter += 1&lt;br /&gt;
        end&lt;br /&gt;
      end&lt;br /&gt;
&lt;br /&gt;
      if valid_answer_counter &amp;gt; 0 and q_max_score &amp;gt; 0&lt;br /&gt;
        # convert the score in percentage&lt;br /&gt;
        question_score /= (valid_answer_counter * q_max_score)&lt;br /&gt;
        question_score = question_score.round(2) * 100&lt;br /&gt;
      end&lt;br /&gt;
&lt;br /&gt;
      question_score&lt;br /&gt;
    end&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Test Plan =&lt;br /&gt;
* Plan to test the response report page (/review_mapping/response_report?id={:assignmentID}) to make sure the new field (avg author feedback) exists&lt;br /&gt;
** Using rspec we will add a test case to ReviewMappingControllerSpec.rb&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
    context 'when type is FeedbackResponseMap' do&lt;br /&gt;
      context 'when assignment has author feedback feature' do&lt;br /&gt;
        it 'renders response_report page with average author feedback data' do&lt;br /&gt;
          allow(assignment).to receive(:varying_rubrics_by_round?).and_return(true)&lt;br /&gt;
          allow(FeedbackResponseMap).to receive(:feedback_response_report).with('1', 'FeedbackResponseMap')&lt;br /&gt;
                                                                          .and_return([participant, participant1], [1, 2], [3, 4], [])&lt;br /&gt;
          params = {&lt;br /&gt;
            id: 1,&lt;br /&gt;
            report: {type: 'FeedbackResponseMap'},&lt;br /&gt;
          }&lt;br /&gt;
          get :response_report, params&lt;br /&gt;
          expect(response).to render_template(:response_report)&lt;br /&gt;
          expect(response).to have(:avg_author_feedback)&lt;br /&gt;
        end&lt;br /&gt;
      end&lt;br /&gt;
   end&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
* Also, we plan to manually test the response report page to make sure the new field is alighning well in the UI in the expected place&lt;br /&gt;
&lt;br /&gt;
= References =&lt;br /&gt;
1) http://wiki.expertiza.ncsu.edu/index.php/Documentation_on_Database_Tables&lt;/div&gt;</summary>
		<author><name>Psingh22</name></author>
	</entry>
	<entry>
		<id>https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2018/E1876_Completion/Progress_view&amp;diff=119757</id>
		<title>CSC/ECE 517 Fall 2018/E1876 Completion/Progress view</title>
		<link rel="alternate" type="text/html" href="https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2018/E1876_Completion/Progress_view&amp;diff=119757"/>
		<updated>2018-11-13T21:06:23Z</updated>

		<summary type="html">&lt;p&gt;Psingh22: /* Test Plan */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;='''Problem Statement'''=&lt;br /&gt;
In Expertiza, peer reviews are used as a metric to evaluate someone’s project. Once someone has peer reviewed a project, the authors of the project can also provide a feedback for this review in terms of ‘Author feedback’. While grading peer reviews, it would be nice for the instructors to take into account the author feedbacks given on a particular peer review, this will be helpful in evaluating how helpful the peer review actually was to the author of the project.&lt;br /&gt;
 &lt;br /&gt;
='''Goal'''=&lt;br /&gt;
&lt;br /&gt;
The aim of this project is to build this into the system. We need an additional column in the 'Review Report' page for reviews which shows the calculation of the author feedback. This will help instructor's to know how the reviews proved useful to the authors/team. The aim of this project is to integrate the author feedback column in the summary page&lt;br /&gt;
&lt;br /&gt;
='''Design'''=&lt;br /&gt;
&lt;br /&gt;
== Database ==&lt;br /&gt;
&lt;br /&gt;
The following are the table structures we will need for mapping. First, the questions table has all the questions based on the questionnaire. We will be only concerned with the questions in the feedback questionnaire. The answers for each question in the feedback questionnaire is saved in Answers table below based on Question ID. Now, in order to know if the answers is a feedback by team members or a review by reviewer, the mapping for Answers table is done by response_id which is a foreign key to response table. Response table gives us map_id which maps to Response Maps table. Now, Response Map table gives us information of the reviewer_id, reviewee_id, reviewed_object_id (which is the id for the assignment being reviewed) and the type (whether it's a teammate review, author feedback or a regular review). We will have to fetch the answers from the Answer table based on response_id because in our case, the response is from a reviewee and not a reviewer. So, we will fetch those answers whose response type is FeedbackResponseMap and calculate scores for those questions from Review_Scores table. &lt;br /&gt;
&lt;br /&gt;
=== Questions Table Structure===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- Questions page already exists,so created a page with the name Questions table and gave an external link on the tables page--&amp;gt;&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
!Field Name !!Type !!Description &lt;br /&gt;
|-&lt;br /&gt;
!id &lt;br /&gt;
|int(11)  &lt;br /&gt;
|unique identifier for the record&lt;br /&gt;
|- &lt;br /&gt;
!txt   &lt;br /&gt;
|text  &lt;br /&gt;
|the question string&lt;br /&gt;
|- &lt;br /&gt;
!weight   &lt;br /&gt;
|int(11)  &lt;br /&gt;
|specifies the weighting of the question&lt;br /&gt;
|- &lt;br /&gt;
!questionnaire_id   &lt;br /&gt;
|int(11)&lt;br /&gt;
|the id of the questionnaire that this question belongs to&lt;br /&gt;
|-&lt;br /&gt;
!seq&lt;br /&gt;
|DECIMAL&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
!type&lt;br /&gt;
|VARCHAR(255)&lt;br /&gt;
|Type of question&lt;br /&gt;
|-&lt;br /&gt;
!size&lt;br /&gt;
|VARCHAR(255)&lt;br /&gt;
|Size of the question&lt;br /&gt;
|-&lt;br /&gt;
!alternatives&lt;br /&gt;
|VARCHAR(255)&lt;br /&gt;
|Other question which means the same&lt;br /&gt;
|-&lt;br /&gt;
!break_before&lt;br /&gt;
|BIT&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
!max_label&lt;br /&gt;
|VARCHAR(255)&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
!min_label&lt;br /&gt;
|VARCHAR(255)&lt;br /&gt;
|&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Answer Table Structure ===&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
!Field Name !!Type !!Description &lt;br /&gt;
|- &lt;br /&gt;
!id   &lt;br /&gt;
|int(11)  &lt;br /&gt;
|Unique ID for each Answers record.&lt;br /&gt;
|- &lt;br /&gt;
!question_id   &lt;br /&gt;
|int(11) &lt;br /&gt;
|ID of Question.&lt;br /&gt;
|- &lt;br /&gt;
!answer   &lt;br /&gt;
|int(11)  &lt;br /&gt;
|Value of each of the answer.&lt;br /&gt;
|- &lt;br /&gt;
!comments  &lt;br /&gt;
|text  &lt;br /&gt;
|Comment given to the answer.&lt;br /&gt;
|- &lt;br /&gt;
!reponse_id   &lt;br /&gt;
|int(11) &lt;br /&gt;
|ID of the response associated with this Answer.&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Response Table Structure ===&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
!Field Name !!Type !!Description &lt;br /&gt;
|-&lt;br /&gt;
!id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The unique record id&lt;br /&gt;
|-&lt;br /&gt;
!map_id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The ID of the [[response_maps|response map]] defining the relationship that this response applies to&lt;br /&gt;
|-&lt;br /&gt;
!additional_comment&lt;br /&gt;
|text&lt;br /&gt;
|An additional comment provided by the reviewer to support his/her response&lt;br /&gt;
|-&lt;br /&gt;
!updated_at&lt;br /&gt;
|datetime&lt;br /&gt;
|The timestamp indicating when this response was last modified&lt;br /&gt;
|-&lt;br /&gt;
!created_at&lt;br /&gt;
|datetime&lt;br /&gt;
|The timestamp indicating when this response was created&lt;br /&gt;
|-&lt;br /&gt;
!version_num&lt;br /&gt;
|int(11)&lt;br /&gt;
|The version of the review.&lt;br /&gt;
|-&lt;br /&gt;
!round&lt;br /&gt;
|int(11)&lt;br /&gt;
|The round the review is connected to. &lt;br /&gt;
|-&lt;br /&gt;
!is_submitted&lt;br /&gt;
|tinyint(1)&lt;br /&gt;
|Boolean Field to indicate whether the review is submitted.&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Response Map Table ===&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
!Field Name !!Type !!Description &lt;br /&gt;
|-&lt;br /&gt;
!id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The unique record id&lt;br /&gt;
|- &lt;br /&gt;
!reviewed_object_id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The object being reviewed in the [[responses|response]]. Possible objects include other ResponseMaps or [[assignments]]&lt;br /&gt;
|-&lt;br /&gt;
!reviewer_id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The [[participants|participant]] (actually AssignmentParticipant) providing the response&lt;br /&gt;
|-&lt;br /&gt;
!reviewee_id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The [[teams|team]] (AssignmentTeam) receiving the response&lt;br /&gt;
|-&lt;br /&gt;
!type&lt;br /&gt;
|varchar(255)&lt;br /&gt;
|Used for subclassing the response map. Available subclasses are ReviewResponseMap, MetareviewResponseMap, FeedbackResponseMap, TeammateReviewResponseMap  &lt;br /&gt;
|-&lt;br /&gt;
!created_at&lt;br /&gt;
|DATETIME&lt;br /&gt;
|Date and Time for when the record was created&lt;br /&gt;
|-&lt;br /&gt;
!updated_at&lt;br /&gt;
|DATETIME&lt;br /&gt;
|Date and Time when the last update was made&lt;br /&gt;
|-&lt;br /&gt;
!calibrate_to&lt;br /&gt;
|BIT&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
== UI Implementation ==&lt;br /&gt;
&lt;br /&gt;
In the page Review report for Design exercise ( login as an instructor -&amp;gt; Manage -&amp;gt; Assignments -&amp;gt; View review report ), we are planning to add one more column to show the average ratings for the authors feedback on a particular assignment. The logic for calculating the average score for the feedback would be similar to already implemented logic for score awarded/ average score column. Below attached shows the page we are planning to edit.&lt;br /&gt;
&lt;br /&gt;
[[File:Feedback_new.png]]&lt;br /&gt;
&lt;br /&gt;
== Code Logic ==&lt;br /&gt;
&lt;br /&gt;
The following sketch shows the logic we plan to write for calculating the average score of the feedback given by authors.&lt;br /&gt;
&lt;br /&gt;
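A minimal sketch of this calculation follows. It assumes we are given the Answer records for a set of author feedback responses together with the maximum score of a feedback question; the method name, parameters, and percentage scaling are our assumptions, not existing Expertiza code.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
 # Sketch only: average the numeric answer values from author feedback&lt;br /&gt;
 # and scale the result against the maximum possible score.&lt;br /&gt;
 def avg_author_feedback_score(feedback_answers, max_score)&lt;br /&gt;
   scored = feedback_answers.reject { |a| a.answer.nil? } # skip unanswered questions&lt;br /&gt;
   return nil if scored.empty? || max_score.zero?         # nothing to average&lt;br /&gt;
   total = scored.sum { |a| a.answer }&lt;br /&gt;
   # express the average as a percentage of the maximum, as in the&lt;br /&gt;
   # existing Score Awarded/Average Score column&lt;br /&gt;
   (total.to_f / (scored.size * max_score) * 100).round(2)&lt;br /&gt;
 end&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;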
= Test Plan =&lt;br /&gt;
* We plan to test the response report page (/review_mapping/response_report?id={:assignmentID}) to make sure the new field (average author feedback) exists&lt;br /&gt;
** Using RSpec, we will add a test case to ReviewMappingControllerSpec.rb&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
    context 'when type is FeedbackResponseMap' do&lt;br /&gt;
      context 'when assignment has author feedback feature' do&lt;br /&gt;
        it 'renders response_report page with average author feedback data' do&lt;br /&gt;
          allow(assignment).to receive(:varying_rubrics_by_round?).and_return(true)&lt;br /&gt;
          allow(FeedbackResponseMap).to receive(:feedback_response_report).with('1', 'FeedbackResponseMap')&lt;br /&gt;
                                                                          .and_return([[participant, participant1], [1, 2], [3, 4], []])&lt;br /&gt;
          params = {&lt;br /&gt;
            id: 1,&lt;br /&gt;
            report: {type: 'FeedbackResponseMap'},&lt;br /&gt;
          }&lt;br /&gt;
          get :response_report, params&lt;br /&gt;
          expect(response).to render_template(:response_report)&lt;br /&gt;
          # the controller should expose the new average author feedback data to the view&lt;br /&gt;
          expect(assigns(:avg_author_feedback)).not_to be_nil&lt;br /&gt;
        end&lt;br /&gt;
      end&lt;br /&gt;
    end&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
* Also, we plan to test the response report page to make sure the new field aligns correctly in the UI, in the expected place&lt;br /&gt;
&lt;br /&gt;
= References =&lt;br /&gt;
1) http://wiki.expertiza.ncsu.edu/index.php/Documentation_on_Database_Tables&lt;/div&gt;</summary>
		<author><name>Psingh22</name></author>
	</entry>
	<entry>
		<id>https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2018/E1876_Completion/Progress_view&amp;diff=119755</id>
		<title>CSC/ECE 517 Fall 2018/E1876 Completion/Progress view</title>
		<link rel="alternate" type="text/html" href="https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2018/E1876_Completion/Progress_view&amp;diff=119755"/>
		<updated>2018-11-13T21:02:55Z</updated>

		<summary type="html">&lt;p&gt;Psingh22: /* Test Plan */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;='''Problem Statement'''=&lt;br /&gt;
In Expertiza, peer reviews are used as a metric to evaluate someone’s project. Once someone has peer reviewed a project, the authors of the project can also provide feedback on this review in the form of ‘author feedback’. While grading peer reviews, it would be useful for instructors to take into account the author feedback given on a particular peer review; this helps in evaluating how helpful the peer review actually was to the authors of the project.&lt;br /&gt;
 &lt;br /&gt;
='''Goal'''=&lt;br /&gt;
&lt;br /&gt;
The aim of this project is to build this into the system: we need an additional column on the 'Review Report' page that shows the calculated author feedback score for each review. This will help instructors know how useful the reviews were to the authors/team, integrating the author feedback column into the summary page.&lt;br /&gt;
&lt;br /&gt;
='''Design'''=&lt;br /&gt;
&lt;br /&gt;
== Database ==&lt;br /&gt;
&lt;br /&gt;
The following are the table structures we will need for the mapping. First, the questions table holds all the questions for each questionnaire; we are only concerned with the questions in the feedback questionnaire. The answer to each question in the feedback questionnaire is saved in the answers table below, keyed by question ID. To know whether an answer is feedback from team members or a review from a reviewer, the answers table is linked through response_id, a foreign key to the responses table. The responses table gives us map_id, which maps to the response_maps table. The response_maps table, in turn, gives us the reviewer_id, the reviewee_id, the reviewed_object_id (the id of the assignment being reviewed), and the type (whether it is a teammate review, author feedback, or a regular review). We have to fetch the answers from the answers table based on response_id because, in our case, the response comes from a reviewee rather than a reviewer. So we will fetch the answers whose response map type is FeedbackResponseMap and calculate scores for those questions from the Review_Scores table. &lt;br /&gt;
&lt;br /&gt;
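As a rough illustration of this mapping, the query below walks from answers to responses to response maps using the foreign keys described in the table structures below (answers.response_id and responses.map_id), keeping only rows whose map type is FeedbackResponseMap. The model name Answer follows Rails conventions for the answers table; this is a sketch against the documented columns, not final code.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
 # Sketch: collect author-feedback answers by joining across the&lt;br /&gt;
 # documented foreign keys.&lt;br /&gt;
 feedback_answers = Answer&lt;br /&gt;
                    .joins('INNER JOIN responses ON responses.id = answers.response_id')&lt;br /&gt;
                    .joins('INNER JOIN response_maps ON response_maps.id = responses.map_id')&lt;br /&gt;
                    .where('response_maps.type = ?', 'FeedbackResponseMap')&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;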
=== Questions Table Structure===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- Questions page already exists,so created a page with the name Questions table and gave an external link on the tables page--&amp;gt;&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
!Field Name !!Type !!Description &lt;br /&gt;
|-&lt;br /&gt;
!id &lt;br /&gt;
|int(11)  &lt;br /&gt;
|unique identifier for the record&lt;br /&gt;
|- &lt;br /&gt;
!txt   &lt;br /&gt;
|text  &lt;br /&gt;
|the question string&lt;br /&gt;
|- &lt;br /&gt;
!weight   &lt;br /&gt;
|int(11)  &lt;br /&gt;
|specifies the weighting of the question&lt;br /&gt;
|- &lt;br /&gt;
!questionnaire_id   &lt;br /&gt;
|int(11)&lt;br /&gt;
|the id of the questionnaire that this question belongs to&lt;br /&gt;
|-&lt;br /&gt;
!seq&lt;br /&gt;
|DECIMAL&lt;br /&gt;
|Sequence number that determines the display order of the question within the questionnaire&lt;br /&gt;
|-&lt;br /&gt;
!type&lt;br /&gt;
|VARCHAR(255)&lt;br /&gt;
|Type of question&lt;br /&gt;
|-&lt;br /&gt;
!size&lt;br /&gt;
|VARCHAR(255)&lt;br /&gt;
|Display size of the question's input field&lt;br /&gt;
|-&lt;br /&gt;
!alternatives&lt;br /&gt;
|VARCHAR(255)&lt;br /&gt;
|Alternative answer choices for the question&lt;br /&gt;
|-&lt;br /&gt;
!break_before&lt;br /&gt;
|BIT&lt;br /&gt;
|Flag indicating whether a break is inserted before this question when displayed&lt;br /&gt;
|-&lt;br /&gt;
!max_label&lt;br /&gt;
|VARCHAR(255)&lt;br /&gt;
|Label displayed for the maximum score value&lt;br /&gt;
|-&lt;br /&gt;
!min_label&lt;br /&gt;
|VARCHAR(255)&lt;br /&gt;
|Label displayed for the minimum score value&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Answer Table Structure ===&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
!Field Name !!Type !!Description &lt;br /&gt;
|- &lt;br /&gt;
!id   &lt;br /&gt;
|int(11)  &lt;br /&gt;
|Unique ID for each Answers record.&lt;br /&gt;
|- &lt;br /&gt;
!question_id   &lt;br /&gt;
|int(11) &lt;br /&gt;
|ID of Question.&lt;br /&gt;
|- &lt;br /&gt;
!answer   &lt;br /&gt;
|int(11)  &lt;br /&gt;
|Numeric value of the answer.&lt;br /&gt;
|- &lt;br /&gt;
!comments  &lt;br /&gt;
|text  &lt;br /&gt;
|Comment accompanying the answer.&lt;br /&gt;
|- &lt;br /&gt;
!response_id   &lt;br /&gt;
|int(11) &lt;br /&gt;
|ID of the response associated with this Answer.&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Response Table Structure ===&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
!Field Name !!Type !!Description &lt;br /&gt;
|-&lt;br /&gt;
!id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The unique record id&lt;br /&gt;
|-&lt;br /&gt;
!map_id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The ID of the [[response_maps|response map]] defining the relationship that this response applies to&lt;br /&gt;
|-&lt;br /&gt;
!additional_comment&lt;br /&gt;
|text&lt;br /&gt;
|An additional comment provided by the reviewer to support his/her response&lt;br /&gt;
|-&lt;br /&gt;
!updated_at&lt;br /&gt;
|datetime&lt;br /&gt;
|The timestamp indicating when this response was last modified&lt;br /&gt;
|-&lt;br /&gt;
!created_at&lt;br /&gt;
|datetime&lt;br /&gt;
|The timestamp indicating when this response was created&lt;br /&gt;
|-&lt;br /&gt;
!version_num&lt;br /&gt;
|int(11)&lt;br /&gt;
|The version of the review.&lt;br /&gt;
|-&lt;br /&gt;
!round&lt;br /&gt;
|int(11)&lt;br /&gt;
|The round the review is connected to. &lt;br /&gt;
|-&lt;br /&gt;
!is_submitted&lt;br /&gt;
|tinyint(1)&lt;br /&gt;
|Boolean Field to indicate whether the review is submitted.&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Response Map Table ===&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
!Field Name !!Type !!Description &lt;br /&gt;
|-&lt;br /&gt;
!id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The unique record id&lt;br /&gt;
|- &lt;br /&gt;
!reviewed_object_id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The object being reviewed in the [[responses|response]]. Possible objects include other ResponseMaps or [[assignments]]&lt;br /&gt;
|-&lt;br /&gt;
!reviewer_id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The [[participants|participant]] (actually AssignmentParticipant) providing the response&lt;br /&gt;
|-&lt;br /&gt;
!reviewee_id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The [[teams|team]] (AssignmentTeam) receiving the response&lt;br /&gt;
|-&lt;br /&gt;
!type&lt;br /&gt;
|varchar(255)&lt;br /&gt;
|Used for subclassing the response map. Available subclasses are ReviewResponseMap, MetareviewResponseMap, FeedbackResponseMap, TeammateReviewResponseMap  &lt;br /&gt;
|-&lt;br /&gt;
!created_at&lt;br /&gt;
|DATETIME&lt;br /&gt;
|Date and Time for when the record was created&lt;br /&gt;
|-&lt;br /&gt;
!updated_at&lt;br /&gt;
|DATETIME&lt;br /&gt;
|Date and Time when the last update was made&lt;br /&gt;
|-&lt;br /&gt;
!calibrate_to&lt;br /&gt;
|BIT&lt;br /&gt;
|Flag indicating whether this response map is used for calibration reviews&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
== UI Implementation ==&lt;br /&gt;
&lt;br /&gt;
In the page Review report for Design exercise (log in as an instructor, then go to Manage -&amp;gt; Assignments -&amp;gt; View review report), we plan to add one more column that shows the average rating from the authors' feedback on a particular assignment. The logic for calculating the average feedback score will be similar to the logic already implemented for the &amp;quot;Score Awarded/Average Score&amp;quot; column. The screenshot below shows the page we plan to edit.&lt;br /&gt;
&lt;br /&gt;
[[File:Feedback_new.png]]&lt;br /&gt;
&lt;br /&gt;
== Code Logic ==&lt;br /&gt;
&lt;br /&gt;
The following sketch shows the logic we plan to write for calculating the average score of the feedback given by authors.&lt;br /&gt;
&lt;br /&gt;
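A minimal sketch of this calculation follows. It assumes we are given the Answer records for a set of author feedback responses together with the maximum score of a feedback question; the method name, parameters, and percentage scaling are our assumptions, not existing Expertiza code.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
 # Sketch only: average the numeric answer values from author feedback&lt;br /&gt;
 # and scale the result against the maximum possible score.&lt;br /&gt;
 def avg_author_feedback_score(feedback_answers, max_score)&lt;br /&gt;
   scored = feedback_answers.reject { |a| a.answer.nil? } # skip unanswered questions&lt;br /&gt;
   return nil if scored.empty? || max_score.zero?         # nothing to average&lt;br /&gt;
   total = scored.sum { |a| a.answer }&lt;br /&gt;
   # express the average as a percentage of the maximum, as in the&lt;br /&gt;
   # existing Score Awarded/Average Score column&lt;br /&gt;
   (total.to_f / (scored.size * max_score) * 100).round(2)&lt;br /&gt;
 end&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;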
= Test Plan =&lt;br /&gt;
* We plan to test the response report page (/review_mapping/response_report?id={:assignmentID}) to make sure the new field (average author feedback) exists&lt;br /&gt;
** Using RSpec, we will add a test case to ReviewMappingControllerSpec.rb&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
    context 'when type is FeedbackResponseMap' do&lt;br /&gt;
      context 'when assignment has varying_rubrics_by_round feature' do&lt;br /&gt;
        it 'renders response_report page with corresponding data' do&lt;br /&gt;
          allow(assignment).to receive(:varying_rubrics_by_round?).and_return(true)&lt;br /&gt;
          allow(FeedbackResponseMap).to receive(:feedback_response_report).with('1', 'FeedbackResponseMap')&lt;br /&gt;
                                                                          .and_return([[participant, participant1], [1, 2], [3, 4], []])&lt;br /&gt;
          params = {&lt;br /&gt;
            id: 1,&lt;br /&gt;
            report: {type: 'FeedbackResponseMap'},&lt;br /&gt;
          }&lt;br /&gt;
          get :response_report, params&lt;br /&gt;
          expect(response).to render_template(:response_report)&lt;br /&gt;
          # the controller should expose the new average author feedback data to the view&lt;br /&gt;
          expect(assigns(:avg_author_feedback)).not_to be_nil&lt;br /&gt;
        end&lt;br /&gt;
      end&lt;br /&gt;
    end&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
* Also, we plan to test the response report page to make sure the new field aligns correctly in the UI, in the expected place&lt;br /&gt;
&lt;br /&gt;
= References =&lt;br /&gt;
1) http://wiki.expertiza.ncsu.edu/index.php/Documentation_on_Database_Tables&lt;/div&gt;</summary>
		<author><name>Psingh22</name></author>
	</entry>
	<entry>
		<id>https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2018/E1876_Completion/Progress_view&amp;diff=119754</id>
		<title>CSC/ECE 517 Fall 2018/E1876 Completion/Progress view</title>
		<link rel="alternate" type="text/html" href="https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2018/E1876_Completion/Progress_view&amp;diff=119754"/>
		<updated>2018-11-13T21:02:38Z</updated>

		<summary type="html">&lt;p&gt;Psingh22: /* Test Plan */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;='''Problem Statement'''=&lt;br /&gt;
In Expertiza, peer reviews are used as a metric to evaluate someone’s project. Once someone has peer reviewed a project, the authors of the project can also provide feedback on this review in the form of ‘author feedback’. While grading peer reviews, it would be useful for instructors to take into account the author feedback given on a particular peer review; this helps in evaluating how helpful the peer review actually was to the authors of the project.&lt;br /&gt;
 &lt;br /&gt;
='''Goal'''=&lt;br /&gt;
&lt;br /&gt;
The aim of this project is to build this into the system: we need an additional column on the 'Review Report' page that shows the calculated author feedback score for each review. This will help instructors know how useful the reviews were to the authors/team, integrating the author feedback column into the summary page.&lt;br /&gt;
&lt;br /&gt;
='''Design'''=&lt;br /&gt;
&lt;br /&gt;
== Database ==&lt;br /&gt;
&lt;br /&gt;
The following are the table structures we will need for the mapping. First, the questions table holds all the questions for each questionnaire; we are only concerned with the questions in the feedback questionnaire. The answer to each question in the feedback questionnaire is saved in the answers table below, keyed by question ID. To know whether an answer is feedback from team members or a review from a reviewer, the answers table is linked through response_id, a foreign key to the responses table. The responses table gives us map_id, which maps to the response_maps table. The response_maps table, in turn, gives us the reviewer_id, the reviewee_id, the reviewed_object_id (the id of the assignment being reviewed), and the type (whether it is a teammate review, author feedback, or a regular review). We have to fetch the answers from the answers table based on response_id because, in our case, the response comes from a reviewee rather than a reviewer. So we will fetch the answers whose response map type is FeedbackResponseMap and calculate scores for those questions from the Review_Scores table. &lt;br /&gt;
&lt;br /&gt;
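As a rough illustration of this mapping, the query below walks from answers to responses to response maps using the foreign keys described in the table structures below (answers.response_id and responses.map_id), keeping only rows whose map type is FeedbackResponseMap. The model name Answer follows Rails conventions for the answers table; this is a sketch against the documented columns, not final code.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
 # Sketch: collect author-feedback answers by joining across the&lt;br /&gt;
 # documented foreign keys.&lt;br /&gt;
 feedback_answers = Answer&lt;br /&gt;
                    .joins('INNER JOIN responses ON responses.id = answers.response_id')&lt;br /&gt;
                    .joins('INNER JOIN response_maps ON response_maps.id = responses.map_id')&lt;br /&gt;
                    .where('response_maps.type = ?', 'FeedbackResponseMap')&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;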
=== Questions Table Structure===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- Questions page already exists,so created a page with the name Questions table and gave an external link on the tables page--&amp;gt;&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
!Field Name !!Type !!Description &lt;br /&gt;
|-&lt;br /&gt;
!id &lt;br /&gt;
|int(11)  &lt;br /&gt;
|unique identifier for the record&lt;br /&gt;
|- &lt;br /&gt;
!txt   &lt;br /&gt;
|text  &lt;br /&gt;
|the question string&lt;br /&gt;
|- &lt;br /&gt;
!weight   &lt;br /&gt;
|int(11)  &lt;br /&gt;
|specifies the weighting of the question&lt;br /&gt;
|- &lt;br /&gt;
!questionnaire_id   &lt;br /&gt;
|int(11)&lt;br /&gt;
|the id of the questionnaire that this question belongs to&lt;br /&gt;
|-&lt;br /&gt;
!seq&lt;br /&gt;
|DECIMAL&lt;br /&gt;
|Sequence number that determines the display order of the question within the questionnaire&lt;br /&gt;
|-&lt;br /&gt;
!type&lt;br /&gt;
|VARCHAR(255)&lt;br /&gt;
|Type of question&lt;br /&gt;
|-&lt;br /&gt;
!size&lt;br /&gt;
|VARCHAR(255)&lt;br /&gt;
|Display size of the question's input field&lt;br /&gt;
|-&lt;br /&gt;
!alternatives&lt;br /&gt;
|VARCHAR(255)&lt;br /&gt;
|Alternative answer choices for the question&lt;br /&gt;
|-&lt;br /&gt;
!break_before&lt;br /&gt;
|BIT&lt;br /&gt;
|Flag indicating whether a break is inserted before this question when displayed&lt;br /&gt;
|-&lt;br /&gt;
!max_label&lt;br /&gt;
|VARCHAR(255)&lt;br /&gt;
|Label displayed for the maximum score value&lt;br /&gt;
|-&lt;br /&gt;
!min_label&lt;br /&gt;
|VARCHAR(255)&lt;br /&gt;
|Label displayed for the minimum score value&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Answer Table Structure ===&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
!Field Name !!Type !!Description &lt;br /&gt;
|- &lt;br /&gt;
!id   &lt;br /&gt;
|int(11)  &lt;br /&gt;
|Unique ID for each Answers record.&lt;br /&gt;
|- &lt;br /&gt;
!question_id   &lt;br /&gt;
|int(11) &lt;br /&gt;
|ID of Question.&lt;br /&gt;
|- &lt;br /&gt;
!answer   &lt;br /&gt;
|int(11)  &lt;br /&gt;
|Numeric value of the answer.&lt;br /&gt;
|- &lt;br /&gt;
!comments  &lt;br /&gt;
|text  &lt;br /&gt;
|Comment accompanying the answer.&lt;br /&gt;
|- &lt;br /&gt;
!response_id   &lt;br /&gt;
|int(11) &lt;br /&gt;
|ID of the response associated with this Answer.&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Response Table Structure ===&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
!Field Name !!Type !!Description &lt;br /&gt;
|-&lt;br /&gt;
!id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The unique record id&lt;br /&gt;
|-&lt;br /&gt;
!map_id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The ID of the [[response_maps|response map]] defining the relationship that this response applies to&lt;br /&gt;
|-&lt;br /&gt;
!additional_comment&lt;br /&gt;
|text&lt;br /&gt;
|An additional comment provided by the reviewer to support his/her response&lt;br /&gt;
|-&lt;br /&gt;
!updated_at&lt;br /&gt;
|datetime&lt;br /&gt;
|The timestamp indicating when this response was last modified&lt;br /&gt;
|-&lt;br /&gt;
!created_at&lt;br /&gt;
|datetime&lt;br /&gt;
|The timestamp indicating when this response was created&lt;br /&gt;
|-&lt;br /&gt;
!version_num&lt;br /&gt;
|int(11)&lt;br /&gt;
|The version of the review.&lt;br /&gt;
|-&lt;br /&gt;
!round&lt;br /&gt;
|int(11)&lt;br /&gt;
|The round the review is connected to. &lt;br /&gt;
|-&lt;br /&gt;
!is_submitted&lt;br /&gt;
|tinyint(1)&lt;br /&gt;
|Boolean Field to indicate whether the review is submitted.&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Response Map Table ===&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
!Field Name !!Type !!Description &lt;br /&gt;
|-&lt;br /&gt;
!id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The unique record id&lt;br /&gt;
|- &lt;br /&gt;
!reviewed_object_id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The object being reviewed in the [[responses|response]]. Possible objects include other ResponseMaps or [[assignments]]&lt;br /&gt;
|-&lt;br /&gt;
!reviewer_id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The [[participants|participant]] (actually AssignmentParticipant) providing the response&lt;br /&gt;
|-&lt;br /&gt;
!reviewee_id&lt;br /&gt;
|int(11)&lt;br /&gt;
|The [[teams|team]] (AssignmentTeam) receiving the response&lt;br /&gt;
|-&lt;br /&gt;
!type&lt;br /&gt;
|varchar(255)&lt;br /&gt;
|Used for subclassing the response map. Available subclasses are ReviewResponseMap, MetareviewResponseMap, FeedbackResponseMap, TeammateReviewResponseMap  &lt;br /&gt;
|-&lt;br /&gt;
!created_at&lt;br /&gt;
|DATETIME&lt;br /&gt;
|Date and Time for when the record was created&lt;br /&gt;
|-&lt;br /&gt;
!updated_at&lt;br /&gt;
|DATETIME&lt;br /&gt;
|Date and Time when the last update was made&lt;br /&gt;
|-&lt;br /&gt;
!calibrate_to&lt;br /&gt;
|BIT&lt;br /&gt;
|Flag indicating whether this response map is used for calibration reviews&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
== UI Implementation ==&lt;br /&gt;
&lt;br /&gt;
In the page Review report for Design exercise (log in as an instructor, then go to Manage -&amp;gt; Assignments -&amp;gt; View review report), we plan to add one more column that shows the average rating from the authors' feedback on a particular assignment. The logic for calculating the average feedback score will be similar to the logic already implemented for the &amp;quot;Score Awarded/Average Score&amp;quot; column. The screenshot below shows the page we plan to edit.&lt;br /&gt;
&lt;br /&gt;
[[File:Feedback_new.png]]&lt;br /&gt;
&lt;br /&gt;
== Code Logic ==&lt;br /&gt;
&lt;br /&gt;
The following sketch shows the logic we plan to write for calculating the average score of the feedback given by authors.&lt;br /&gt;
&lt;br /&gt;
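A minimal sketch of this calculation follows. It assumes we are given the Answer records for a set of author feedback responses together with the maximum score of a feedback question; the method name, parameters, and percentage scaling are our assumptions, not existing Expertiza code.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
 # Sketch only: average the numeric answer values from author feedback&lt;br /&gt;
 # and scale the result against the maximum possible score.&lt;br /&gt;
 def avg_author_feedback_score(feedback_answers, max_score)&lt;br /&gt;
   scored = feedback_answers.reject { |a| a.answer.nil? } # skip unanswered questions&lt;br /&gt;
   return nil if scored.empty? || max_score.zero?         # nothing to average&lt;br /&gt;
   total = scored.sum { |a| a.answer }&lt;br /&gt;
   # express the average as a percentage of the maximum, as in the&lt;br /&gt;
   # existing Score Awarded/Average Score column&lt;br /&gt;
   (total.to_f / (scored.size * max_score) * 100).round(2)&lt;br /&gt;
 end&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;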
= Test Plan =&lt;br /&gt;
* We plan to test the response report page (/review_mapping/response_report?id={:assignmentID}) to make sure the new field (average author feedback) exists&lt;br /&gt;
** Using RSpec, we will add a test case to ReviewMappingControllerSpec.rb&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
    context 'when type is FeedbackResponseMap' do&lt;br /&gt;
      context 'when assignment has varying_rubrics_by_round feature' do&lt;br /&gt;
        it 'renders response_report page with corresponding data' do&lt;br /&gt;
          allow(assignment).to receive(:varying_rubrics_by_round?).and_return(true)&lt;br /&gt;
          allow(FeedbackResponseMap).to receive(:feedback_response_report).with('1', 'FeedbackResponseMap')&lt;br /&gt;
                                                                          .and_return([[participant, participant1], [1, 2], [3, 4], []])&lt;br /&gt;
          params = {&lt;br /&gt;
            id: 1,&lt;br /&gt;
            report: {type: 'FeedbackResponseMap'},&lt;br /&gt;
          }&lt;br /&gt;
          get :response_report, params&lt;br /&gt;
          expect(response).to render_template(:response_report)&lt;br /&gt;
          # the controller should expose the new average author feedback data to the view&lt;br /&gt;
          expect(assigns(:avg_author_feedback)).not_to be_nil&lt;br /&gt;
        end&lt;br /&gt;
      end&lt;br /&gt;
    end&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
* Also, we plan to test the response report page to make sure the new field aligns correctly in the UI, in the expected place&lt;br /&gt;
&lt;br /&gt;
= References =&lt;br /&gt;
1) http://wiki.expertiza.ncsu.edu/index.php/Documentation_on_Database_Tables&lt;/div&gt;</summary>
		<author><name>Psingh22</name></author>
	</entry>
	<entry>
		<id>https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2018&amp;diff=119734</id>
		<title>CSC/ECE 517 Fall 2018</title>
		<link rel="alternate" type="text/html" href="https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2018&amp;diff=119734"/>
		<updated>2018-11-13T20:01:01Z</updated>

		<summary type="html">&lt;p&gt;Psingh22: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;* [[CSC/ECE 517 Fall 2018- Project E1846. OSS Project Navy: Character Issues]]&lt;br /&gt;
* [[CSC/ECE 517 Fall 2018/OSS E1848 Write unit tests for assignment team.rb]]&lt;br /&gt;
* [http://wiki.expertiza.ncsu.edu/index.php/E1839_Review_Requirements_and_Thresholds CSC/ECE 517 Fall 2018 E1839 Review Requirements and Thresholds]&lt;br /&gt;
* [http://wiki.expertiza.ncsu.edu/index.php/E1848_Write_unit_tests_for_assignment_team CSC/ECE 517 Fall 2018 E1848 Write unit tests for assignment_team]&lt;br /&gt;
* [http://wiki.expertiza.ncsu.edu/index.php/E1835_Refactor_delayed_mailer_and_scheduled_task CSC/ECE 517 Fall 2018 E1835_Refactor_delayed_mailer_and_scheduled_task]&lt;br /&gt;
* [http://wiki.expertiza.ncsu.edu/index.php/E1829_OSS_project_Duke_Blue_Fix_import_glitches CSC/ECE 517 Fall 2018 E1829 OSS project Duke Blue: Fix import glitches]&lt;br /&gt;
* [http://wiki.expertiza.ncsu.edu/index.php/CSC/ECE_517_Fall_2018/E1853_Write_unit_tests_for_menu.rb CSC/ECE 517 Fall 2018 E1853 Write unit tests for menu.rb]&lt;br /&gt;
* [http://wiki.expertiza.ncsu.edu/index.php/E1853_write_unit_tests_for_menu CSC/ECE 517 Fall 2018 E1853 Write Unit Tests For menu.rb]&lt;br /&gt;
* [http://wiki.expertiza.ncsu.edu/index.php/CSC/ECE_517_Fall_2018_-_Project_E1852_Write_unit_tests_for_participant.rb CSC/ECE 517 Fall 2018 E1852 Write unit tests for participant.rb]&lt;br /&gt;
* [http://wiki.expertiza.ncsu.edu/index.php/E1844_Issues_related_to_names CSC/ECE 517 Fall 2018 E1844 Issues related to names]&lt;br /&gt;
* [http://wiki.expertiza.ncsu.edu/index.php/User_talk:Rshakya CSC/ECE 517 Fall 2018/E1852 Unit Test for Participant Model]&lt;br /&gt;
* [http://wiki.expertiza.ncsu.edu/index.php/E1850_Write_unit_tests_for_review_response_map CSC/ECE 517 Fall 2018 Write unit tests for review-response_map.rb]&lt;br /&gt;
* [http://wiki.expertiza.ncsu.edu/index.php/CSC/ECE_517_Fall_2018/E1849_Write_Unit_Tests_for_vm_question_response.rb CSC/ECE 517 Fall 2018/E1849 Write Unit Tests for vm_question_response.rb]&lt;br /&gt;
* [http://wiki.expertiza.ncsu.edu/index.php/CSC/ECE_517_Fall_2018/E1866_Expertiza_Internationalization CSC/ECE 517 Fall 2018/E1866 Expertiza Internationalization]&lt;br /&gt;
* [http://wiki.expertiza.ncsu.edu/index.php/CSC/ECE_517_Fall_2018/E1879_Student_Generated_Questions_Added_To_Rubric CSC/ECE 517 Fall 2018/E1879 Student Generated Questions Added To Rubric]&lt;br /&gt;
* [http://wiki.expertiza.ncsu.edu/index.php/CSC/ECE_517_Fall_2018/E1856_Allow_reviewers_to_bid_on_what_to_review CSC/ECE 517 Fall 2018/E1856 Allow reviewers to bid on what to review]&lt;br /&gt;
* [http://wiki.expertiza.ncsu.edu/index.php/CSC/ECE_517_Fall_2018/E1876_Completion/Progress_view CSC/ECE 517 Fall 2018 E1876_Completion/Progress_view]&lt;/div&gt;</summary>
		<author><name>Psingh22</name></author>
	</entry>
	<entry>
		<id>https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2018&amp;diff=118044</id>
		<title>CSC/ECE 517 Fall 2018</title>
		<link rel="alternate" type="text/html" href="https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2018&amp;diff=118044"/>
		<updated>2018-11-02T18:30:35Z</updated>

		<summary type="html">&lt;p&gt;Psingh22: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;* [[CSC/ECE 517 Fall 2018- Project E1846. OSS Project Navy: Character Issues]]&lt;br /&gt;
* [[CSC/ECE 517 Fall 2018/OSS E1848 Write unit tests for assignment team.rb]]&lt;br /&gt;
* [http://wiki.expertiza.ncsu.edu/index.php/E1839_Review_Requirements_and_Thresholds CSC/ECE 517 Fall 2018 E1839 Review Requirements and Thresholds]&lt;br /&gt;
* [http://wiki.expertiza.ncsu.edu/index.php/E1848_Write_unit_tests_for_assignment_team CSC/ECE 517 Fall 2018 E1848 Write unit tests for assignment_team]&lt;br /&gt;
* [http://wiki.expertiza.ncsu.edu/index.php/E1835_Refactor_delayed_mailer_and_scheduled_task CSC/ECE 517 Fall 2018 E1835_Refactor_delayed_mailer_and_scheduled_task]&lt;/div&gt;</summary>
		<author><name>Psingh22</name></author>
	</entry>
</feed>