<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://wiki.expertiza.ncsu.edu/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Proy4</id>
	<title>Expertiza_Wiki - User contributions [en]</title>
	<link rel="self" type="application/atom+xml" href="https://wiki.expertiza.ncsu.edu/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Proy4"/>
	<link rel="alternate" type="text/html" href="https://wiki.expertiza.ncsu.edu/index.php?title=Special:Contributions/Proy4"/>
	<updated>2026-05-15T04:53:49Z</updated>
	<subtitle>User contributions</subtitle>
	<generator>MediaWiki 1.41.0</generator>
	<entry>
		<id>https://wiki.expertiza.ncsu.edu/index.php?title=File:Uml_g_b_1.jpg&amp;diff=113778</id>
		<title>File:Uml g b 1.jpg</title>
		<link rel="alternate" type="text/html" href="https://wiki.expertiza.ncsu.edu/index.php?title=File:Uml_g_b_1.jpg&amp;diff=113778"/>
		<updated>2017-11-28T00:47:38Z</updated>

		<summary type="html">&lt;p&gt;Proy4: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Proy4</name></author>
	</entry>
	<entry>
		<id>https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2017/E1792_OSS_Visualizations_for_instructors&amp;diff=113777</id>
		<title>CSC/ECE 517 Fall 2017/E1792 OSS Visualizations for instructors</title>
		<link rel="alternate" type="text/html" href="https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2017/E1792_OSS_Visualizations_for_instructors&amp;diff=113777"/>
		<updated>2017-11-28T00:47:19Z</updated>

		<summary type="html">&lt;p&gt;Proy4: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This wiki page is for the description of changes made under E1792 OSS Visualizations for instructors.&lt;br /&gt;
&lt;br /&gt;
__TOC__&lt;br /&gt;
&lt;br /&gt;
== About Expertiza ==&lt;br /&gt;
&lt;br /&gt;
[http://expertiza.ncsu.edu/ Expertiza] is an open-source project based on the [http://rubyonrails.org/ Ruby on Rails] framework. Expertiza allows instructors to create new assignments and customize new or existing ones, and to create a list of topics that students can sign up for. Students can form teams in Expertiza to work on various projects and assignments, and can peer-review other students' submissions. Instructors can also use Expertiza for interactive views of class performance and reviews.&lt;br /&gt;
&lt;br /&gt;
== Introduction ==&lt;br /&gt;
This project aims to improve the visualizations of certain review- and feedback-related pages in the instructor's view of Expertiza. This would help instructors judge the outcomes of reviews and class performance on assignments via graphs and tables, which in turn would ease the process of grading reviews.&lt;br /&gt;
&lt;br /&gt;
== Problem Statement ==&lt;br /&gt;
:* &amp;lt;b&amp;gt;Issue 1&amp;lt;/b&amp;gt;: The scale runs from blue to green, which is not intuitive, and the colors change randomly each time the page is loaded. It would be better to scale from red to green.&lt;br /&gt;
:* &amp;lt;b&amp;gt;Issue 2&amp;lt;/b&amp;gt;: Adjacent bars represent the responses from round 1 to round k. This makes sense only if the rubrics are the same in all review rounds; if the instructor uses the vary-rubric-by-round mechanism, the visualization is misleading.&lt;br /&gt;
:* &amp;lt;b&amp;gt;Issue 3&amp;lt;/b&amp;gt;: The table on the View Scores page is presorted by team, but it should also be sortable alphabetically. The cell view is far too long and should be divided into partials.&lt;br /&gt;
:* &amp;lt;b&amp;gt;Issue 4&amp;lt;/b&amp;gt;: An interactive visualization or table that shows how a class performed on selected rubric criteria would be immensely helpful. It would show instructors where they need to focus more attention.&lt;br /&gt;
&lt;br /&gt;
== UML Diagram ==&lt;br /&gt;
&amp;lt;br&amp;gt;[[File:uml_g_b_1.jpg]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
 	&lt;br /&gt;
== Solutions Implemented ==&lt;br /&gt;
===Issue 1=== &lt;br /&gt;
'''Description:''' The scale runs from blue to green, which is not intuitive, and the colors change randomly each time the page is loaded. It would be better to scale from red to green.&lt;br /&gt;
&lt;br /&gt;
'''Approach:''' The current color coding runs from blue to green and changes randomly. We will use RGB color values to produce a red-to-green scale, with the color chosen by percentage score: for example, a score between 0 and 20% maps to red, and so on. This way, whatever scale is in use, the color coding is always consistent.&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;i&amp;gt;Related screenshot&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:graph_g_b_1.jpg]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
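The percentage-to-color mapping described above can be sketched as a small helper. This is a minimal illustration, not Expertiza's actual code; the method name `score_color` and the linear RGB interpolation are assumptions.

```ruby
# Hypothetical helper: map a percentage score (0-100) to a hex color
# that runs from red (low) to green (high), so the scale is stable
# across page loads instead of random.
def score_color(percentage)
  pct = [[percentage, 0].max, 100].min          # clamp to 0..100
  red   = ((100 - pct) * 255 / 100.0).round     # fades out as score rises
  green = (pct * 255 / 100.0).round             # fades in as score rises
  format("#%02x%02x00", red, green)
end

score_color(0)    # lowest score: "#ff0000" (red)
score_color(100)  # highest score: "#00ff00" (green)
```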
===Issue 2=== &lt;br /&gt;
'''Description:''' Adjacent bars represent the responses from round 1 to round k. This makes sense only if the rubrics are the same in all review rounds; if the instructor uses the vary-rubric-by-round mechanism, the visualization is misleading.&lt;br /&gt;
&lt;br /&gt;
'''Approach:''' The review responses are currently shown in a single grid covering rounds 1 through n. This works when the rubric is the same for every round, but when the rubric varies by round the combined view does not make sense. We therefore plan to create a separate grid for each review round, which handles both the varying-rubric and the same-rubric cases.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;i&amp;gt;Related screenshot&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:graph_g_b_2.jpg]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
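The per-round grids described above amount to grouping responses by round before rendering. A minimal sketch, assuming a simple response structure (`Response` and `grids_by_round` are illustrative names, not Expertiza's actual model):

```ruby
# Hypothetical sketch: group review responses by round so each round is
# rendered as its own grid, which stays correct when rubrics vary by round.
Response = Struct.new(:round, :reviewer, :scores)

def grids_by_round(responses)
  # One grid (hash entry) per round, in round order.
  responses.group_by(&:round).sort.to_h
end

responses = [
  Response.new(1, "A", [5, 4]),
  Response.new(2, "A", [3]),      # round 2 uses a different rubric
  Response.new(1, "B", [4, 5]),
]
grids_by_round(responses).keys    # => [1, 2], one grid per round
```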
===Issue 3=== &lt;br /&gt;
'''Description''': The table is presorted by team, but it should also be sortable alphabetically. The cell view is far too long and should be divided into partials.&lt;br /&gt;
&lt;br /&gt;
'''Approach:''' We can either sort the underlying query alphabetically by the appropriate column, or sort the table dynamically according to the user's chosen column. The overly long view can be addressed with paging.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;i&amp;gt;Related screenshot&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:graph_g_b_4.jpg]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
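The sort-and-page approach above can be sketched in plain Ruby. The names and the page size are illustrative assumptions; Expertiza's actual View Scores code may differ:

```ruby
# Hypothetical sketch: sort score-table rows alphabetically by a chosen
# column, then return one page of results to keep the view short.
def sorted_page(rows, sort_key, page, per_page: 25)
  sorted = rows.sort_by { |row| row[sort_key].to_s.downcase }  # case-insensitive
  sorted.each_slice(per_page).to_a[page - 1] || []             # empty past the end
end

rows = [{ team: "wolfpack" }, { team: "Aggies" }, { team: "miners" }]
sorted_page(rows, :team, 1).map { |r| r[:team] }
# => ["Aggies", "miners", "wolfpack"]
```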
===Issue 4===&lt;br /&gt;
'''Description:''' An interactive visualization or table that shows how a class performed on selected rubric criteria would be immensely helpful. It would show instructors where they need to focus more attention.&lt;br /&gt;
&lt;br /&gt;
'''Approach:''' The goal is to visualize how the entire class performed on the five rubric criteria. Scores are color-coded for each criterion in each submission. When the instructor hovers over the graph, each score shows the number and percentage of students who received that score on that criterion in that submission.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;i&amp;gt;Related screenshot&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:graph_g_b_3.jpg]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
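The hover tooltip described above needs, for each possible score on a criterion, the count and percentage of students who received it. A minimal sketch (the method name and data shape are assumptions, not Expertiza's actual code):

```ruby
# Hypothetical sketch: tally how many students received each score on one
# rubric criterion, with the percentage shown in the hover tooltip.
def score_distribution(scores)
  total = scores.size.to_f
  scores.tally.transform_values do |count|
    { count: count, percent: (count / total * 100).round(1) }
  end
end

# Four students scored 5, 4, 5, 3 on a criterion:
score_distribution([5, 4, 5, 3])[5]  # => {count: 2, percent: 50.0}
```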
== Test Plan ==&lt;br /&gt;
===Issue 1: Check that the color coding runs from red to green for a range of scores===&lt;br /&gt;
*1) Log in as the instructor.&lt;br /&gt;
*2) Create a dummy assignment and some dummy reviews for it. Log out.&lt;br /&gt;
*3) Log in as a student. Attempt the assignment. Log out.&lt;br /&gt;
*4) Log in as another student and repeat step 3.&lt;br /&gt;
*5) Log in as either student and attempt the review. Log out.&lt;br /&gt;
*6) Log in as the instructor. Go to the Review grades page and check the table. If the colors range from red (for the lowest score) to green (for the highest score), the test passes.&lt;br /&gt;
&lt;br /&gt;
===Issue 2: Check that a separate graph is shown for each review round===&lt;br /&gt;
*1) Log in as the instructor.&lt;br /&gt;
*2) Create an assignment with two review rounds, selecting a different rubric for each round.&lt;br /&gt;
*3) Log in as a student (&amp;quot;A&amp;quot;) and submit the assignment. Repeat this for another student (&amp;quot;B&amp;quot;).&lt;br /&gt;
*4) Log in as student A and perform a review. Do the same for student B.&lt;br /&gt;
*5) Resubmit the assignment as students A and B.&lt;br /&gt;
*6) Resubmit the reviews as students A and B; this time the rubric differs from the previous round.&lt;br /&gt;
*7) Log in as the instructor and view the review visualization. Separate graphs should appear for each round.&lt;br /&gt;
&lt;br /&gt;
===Issue 3: Check that the table can be sorted alphabetically by the appropriate column===&lt;br /&gt;
*1) Log in as the instructor.&lt;br /&gt;
*2) Create a dummy assignment with some teams. Log out.&lt;br /&gt;
*3) Log in as a student, attempt the assignment, and log out.&lt;br /&gt;
*4) Repeat step 3 for all dummy teams.&lt;br /&gt;
*5) Log in as the instructor.&lt;br /&gt;
*6) Go to the View Scores page and check the grade table.&lt;br /&gt;
*7) Click on a column header and check whether its data is sorted alphabetically. If so, the test passes.&lt;br /&gt;
&lt;br /&gt;
===Issue 4: Check that the graph shows how the class performed on selected rubric criteria===&lt;br /&gt;
*1) Log in as the instructor.&lt;br /&gt;
*2) Click the button to compute the graphs.&lt;br /&gt;
*3) Compare the bar graphs with the individual student scores in each assignment.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;br /&gt;
*[https://github.com/expertiza/expertiza Expertiza repo]&lt;br /&gt;
*[http://expertiza.ncsu.edu/ The live Expertiza website]&lt;br /&gt;
*[https://api.highcharts.com/highcharts Highcharts API]&lt;/div&gt;</summary>
		<author><name>Proy4</name></author>
	</entry>
	<entry>
		<id>https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2017/E1792_OSS_Visualizations_for_instructors&amp;diff=113774</id>
		<title>CSC/ECE 517 Fall 2017/E1792 OSS Visualizations for instructors</title>
		<link rel="alternate" type="text/html" href="https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2017/E1792_OSS_Visualizations_for_instructors&amp;diff=113774"/>
		<updated>2017-11-28T00:29:56Z</updated>

		<summary type="html">&lt;p&gt;Proy4: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This wiki page is for the description of changes made under E1792 OSS Visualizations for instructors.&lt;br /&gt;
&lt;br /&gt;
__TOC__&lt;br /&gt;
&lt;br /&gt;
== About Expertiza ==&lt;br /&gt;
&lt;br /&gt;
[http://expertiza.ncsu.edu/ Expertiza] is an open-source project based on the [http://rubyonrails.org/ Ruby on Rails] framework. Expertiza allows instructors to create new assignments and customize new or existing ones, and to create a list of topics that students can sign up for. Students can form teams in Expertiza to work on various projects and assignments, and can peer-review other students' submissions. Instructors can also use Expertiza for interactive views of class performance and reviews.&lt;br /&gt;
&lt;br /&gt;
== Introduction ==&lt;br /&gt;
This project aims to improve the visualizations of certain review- and feedback-related pages in the instructor's view of Expertiza. This would help instructors judge the outcomes of reviews and class performance on assignments via graphs and tables, which in turn would ease the process of grading reviews.&lt;br /&gt;
&lt;br /&gt;
== Problem Statement ==&lt;br /&gt;
:* &amp;lt;b&amp;gt;Issue 1&amp;lt;/b&amp;gt;: The scale runs from blue to green, which is not intuitive, and the colors change randomly each time the page is loaded. It would be better to scale from red to green.&lt;br /&gt;
:* &amp;lt;b&amp;gt;Issue 2&amp;lt;/b&amp;gt;: Adjacent bars represent the responses from round 1 to round k. This makes sense only if the rubrics are the same in all review rounds; if the instructor uses the vary-rubric-by-round mechanism, the visualization is misleading.&lt;br /&gt;
:* &amp;lt;b&amp;gt;Issue 3&amp;lt;/b&amp;gt;: The table on the View Scores page is presorted by team, but it should also be sortable alphabetically. The cell view is far too long and should be divided into partials.&lt;br /&gt;
:* &amp;lt;b&amp;gt;Issue 4&amp;lt;/b&amp;gt;: An interactive visualization or table that shows how a class performed on selected rubric criteria would be immensely helpful. It would show instructors where they need to focus more attention.&lt;br /&gt;
&lt;br /&gt;
== UML Diagram ==&lt;br /&gt;
&lt;br /&gt;
 	&lt;br /&gt;
== Solutions Implemented ==&lt;br /&gt;
===Issue 1=== &lt;br /&gt;
'''Description:''' The scale runs from blue to green, which is not intuitive, and the colors change randomly each time the page is loaded. It would be better to scale from red to green.&lt;br /&gt;
&lt;br /&gt;
'''Approach:''' The current color coding runs from blue to green and changes randomly. We will use RGB color values to produce a red-to-green scale, with the color chosen by percentage score: for example, a score between 0 and 20% maps to red, and so on. This way, whatever scale is in use, the color coding is always consistent.&lt;br /&gt;
&amp;lt;i&amp;gt;Related screenshot&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:graph_g_b_1.jpg]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Issue 2=== &lt;br /&gt;
'''Description:''' Adjacent bars represent the responses from round 1 to round k. This makes sense only if the rubrics are the same in all review rounds; if the instructor uses the vary-rubric-by-round mechanism, the visualization is misleading.&lt;br /&gt;
&lt;br /&gt;
'''Approach:''' The review responses are currently shown in a single grid covering rounds 1 through n. This works when the rubric is the same for every round, but when the rubric varies by round the combined view does not make sense. We therefore plan to create a separate grid for each review round, which handles both the varying-rubric and the same-rubric cases.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;i&amp;gt;Related screenshot&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:graph_g_b_2.jpg]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Issue 3=== &lt;br /&gt;
'''Description''': The table is presorted by team, but it should also be sortable alphabetically. The cell view is far too long and should be divided into partials.&lt;br /&gt;
&lt;br /&gt;
'''Approach:''' We can either sort the underlying query alphabetically by the appropriate column, or sort the table dynamically according to the user's chosen column. The overly long view can be addressed with paging.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;i&amp;gt;Related screenshot&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:graph_g_b_4.jpg]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
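The sort-then-paginate idea above can be shown with plain Ruby enumerables (sample team names and the page size of 2 are illustrative; in the app the sort would typically live in the query and the paging in a helper such as will_paginate):

```ruby
# Sort team rows alphabetically (case-insensitively), then page the
# result so a long table is displayed one slice at a time.
teams = ["wolfpack", "Alpha", "byte_me", "Zeta"]

sorted = teams.sort_by { |name| name.downcase }  # case-insensitive A-Z
pages  = sorted.each_slice(2).to_a               # 2 rows per page

pages.each_with_index do |page, i|
  puts "Page #{i + 1}: #{page.join(', ')}"
end
```

Sorting case-insensitively matters here because team names are user-entered and mix capitalization.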
&lt;br /&gt;
===Issue 4===&lt;br /&gt;
'''Description:''' An interactive visualization or table that shows how a class performed on selected rubric criteria would be immensely helpful. It would show the instructor what needs more attention.&lt;br /&gt;
&lt;br /&gt;
'''Approach:''' The goal is to show the performance of the entire class on the five rubric criteria. Scores will be color coded for each rubric in each submission. When hovering over the graph, for each score the instructor should see the number of students and the percentage of students who scored that particular point on that rubric in that submission.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;i&amp;gt;Related screenshot&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:graph_g_b_3.jpg]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
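The hover data described above reduces to a score distribution per rubric criterion: a count and a class percentage for each point value. A minimal sketch of that tally (the sample scores are made up):

```ruby
# For one rubric criterion, count how many students earned each score
# and what percentage of the class that represents -- the data the
# hover tooltip would display.
scores = [5, 4, 5, 3, 4, 5, 2, 4]  # one entry per student

distribution = scores.tally.sort.map do |score, count|
  { score: score, students: count, percent: (100.0 * count / scores.size).round(1) }
end

distribution.each do |row|
  puts "score #{row[:score]}: #{row[:students]} student(s), #{row[:percent]}%"
end
```

A structure like `distribution` can be handed straight to the charting layer (e.g. a Highcharts tooltip formatter) without recomputing anything on hover.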
&lt;br /&gt;
== Test Plan ==&lt;br /&gt;
===Issue 1: Check the color coding when different numbers are given on different scales.===&lt;br /&gt;
*1) Login as instructor&lt;br /&gt;
*2) Create a dummy assignment and some dummy reviews for it. Log out.&lt;br /&gt;
*3) Login as a student. Attempt the assignment. Log out.&lt;br /&gt;
*4) Login as another student and repeat step 3.&lt;br /&gt;
*5) Login as either student and attempt the review. Logout.&lt;br /&gt;
*6) Login as instructor. Go to the Review grades page and check the table. If the color code ranges from red (for the lowest score) to green (for the highest score), the test passes.&lt;br /&gt;
&lt;br /&gt;
===Issue 2: Enter multiple reviews with different rubrics and check that a separate table/grid is shown for each review round.===&lt;br /&gt;
*1) Login as instructor.&lt;br /&gt;
*2) Create an assignment with 2 review rounds, selecting a different rubric for each round.&lt;br /&gt;
*3) Login as a student (&amp;quot;A&amp;quot;) and submit the assignment. Repeat this for another student (&amp;quot;B&amp;quot;).&lt;br /&gt;
*4) Login as student A and perform a review. Do the same for student B.&lt;br /&gt;
*5) Now resubmit the assignment as students A and B.&lt;br /&gt;
*6) Resubmit the reviews as students A and B. This time the rubric will differ from the previous round.&lt;br /&gt;
*7) Now login as instructor and view the visualization of the reviews. A separate graph/grid should be shown for each review round.&lt;br /&gt;
&lt;br /&gt;
===Issue 3: Check whether the table can be sorted alphabetically by the appropriate column===&lt;br /&gt;
*1) Login as instructor&lt;br /&gt;
*2) Create a dummy assignment with some teams. Logout.&lt;br /&gt;
*3) Login as a student, attempt the assignment, and log out.&lt;br /&gt;
*4) Repeat step 3 for all dummy teams.&lt;br /&gt;
*5) Login as instructor.&lt;br /&gt;
*6) Go to the View Scores page and check the grade table.&lt;br /&gt;
*7) Click on a column header and check whether its data is sorted alphabetically. If so, the test passes.&lt;br /&gt;
&lt;br /&gt;
===Issue 4: An interactive visualization or table that shows how a class performed on selected rubric criteria===&lt;br /&gt;
*1) Login as the instructor&lt;br /&gt;
*2) Click on the button to compute graphs&lt;br /&gt;
*3) Compare the bar graphs with the individual scores of students in each assignment.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;br /&gt;
*[https://github.com/expertiza/expertiza Expertiza repo]&lt;br /&gt;
*[http://expertiza.ncsu.edu/ The live Expertiza website]&lt;br /&gt;
*[https://api.highcharts.com/highcharts Highcharts API]&lt;/div&gt;</summary>
		<author><name>Proy4</name></author>
	</entry>
	<entry>
		<id>https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2017/E1792_OSS_Visualizations_for_instructors&amp;diff=113770</id>
		<title>CSC/ECE 517 Fall 2017/E1792 OSS Visualizations for instructors</title>
		<link rel="alternate" type="text/html" href="https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2017/E1792_OSS_Visualizations_for_instructors&amp;diff=113770"/>
		<updated>2017-11-28T00:20:55Z</updated>

		<summary type="html">&lt;p&gt;Proy4: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This wiki page is for the description of changes made under E1792 OSS Visualizations for instructors.&lt;br /&gt;
&lt;br /&gt;
__TOC__&lt;br /&gt;
&lt;br /&gt;
== About Expertiza ==&lt;br /&gt;
&lt;br /&gt;
[http://expertiza.ncsu.edu/ Expertiza] is an open source project based on [http://rubyonrails.org/ Ruby on Rails] framework. Expertiza allows the instructor to create new assignments and customize new or existing assignments. It also allows the instructor to create a list of topics the students can sign up for. Students can form teams in Expertiza to work on various projects and assignments. Students can also peer review other students' submissions. Instructors can also use Expertiza for interactive views of class performances and reviews.&lt;br /&gt;
&lt;br /&gt;
== Introduction ==&lt;br /&gt;
This project aims to improve the visualizations of certain pages related to reviews and feedback in Expertiza in the instructor's view. This would aid the instructors to judge outcomes of reviews and class performance in assignments via graphs and tables, which in turn would ease the process of grading the reviews.&lt;br /&gt;
&lt;br /&gt;
== Problem Statement ==&lt;br /&gt;
:* &amp;lt;b&amp;gt;Issue 1&amp;lt;/b&amp;gt;: The color scale runs from blue to green, which is unintuitive, and the colors change randomly each time the page loads. It would be better to scale from red to green.&lt;br /&gt;
:* &amp;lt;b&amp;gt;Issue 2&amp;lt;/b&amp;gt;: Two adjacent bars represent the responses in round 1 through round k. This makes sense only if the rubrics are the same in all review rounds. If the instructor uses the vary-rubric-by-round mechanism, this visualization does not make sense.&lt;br /&gt;
:* &amp;lt;b&amp;gt;Issue 3&amp;lt;/b&amp;gt;: The table on the View Scores page is presorted by team, but it should also be sortable alphabetically. The cell view looks far too long and should be divided into partials.&lt;br /&gt;
:* &amp;lt;b&amp;gt;Issue 4&amp;lt;/b&amp;gt;: An interactive visualization or table that shows how a class performed on selected rubric criteria would be immensely helpful. It would show the instructors what they need to focus more attention on.&lt;br /&gt;
 	&lt;br /&gt;
== Solutions Implemented ==&lt;br /&gt;
===Issue 1=== &lt;br /&gt;
'''Description:''' The color scale runs from blue to green, which is unintuitive, and the colors change randomly each time the page loads. It would be better to scale from red to green.&lt;br /&gt;
&lt;br /&gt;
'''Approach:''' The current color coding runs from blue to green and changes randomly. We will instead use RGB color coding to produce a red-to-green scale, with the color determined by the score's percentage: for example, a score between 0% and 20% is shown in red, and so on. This way the color coding is consistent no matter what scale is used.&lt;br /&gt;
&amp;lt;i&amp;gt;Related screenshot&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:graph_g_b_1.jpg]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
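The percentage-based color mapping described above can be sketched in Ruby as follows. This is a hypothetical helper, not Expertiza's actual code; it interpolates linearly from red at 0% to green at 100%:

```ruby
# Map a score percentage (0-100) to an RGB hex color, fading
# linearly from red (low scores) to green (high scores) so the
# coloring is stable regardless of the rubric's point scale.
def score_color(percentage)
  pct = [[percentage, 0].max, 100].min       # clamp to 0..100
  red   = (255 * (100 - pct) / 100.0).round  # fades out as the score rises
  green = (255 * pct / 100.0).round          # fades in as the score rises
  format('#%02x%02x00', red, green)
end

score_color(0)    # => "#ff0000" (red)
score_color(100)  # => "#00ff00" (green)
```

Because the input is a percentage rather than a raw score, the same helper works for any rubric scale.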
&lt;br /&gt;
===Issue 2=== &lt;br /&gt;
'''Description:''' Adjacent bars represent the responses from round 1 to round k. This makes sense only if the rubric is the same in every review round; if the instructor uses the vary-rubric-by-round mechanism, the visualization is misleading.&lt;br /&gt;
&lt;br /&gt;
'''Approach:''' The review responses are shown in a single grid covering all reviews from round 1 to round n. This works when the rubric is the same in every round, but when the rubric varies by round the combined view is misleading. We therefore create a separate grid for each review round, which handles both the varying-rubric and the same-rubric cases.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;i&amp;gt;Related screenshot&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:graph_g_b_2.jpg]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
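The per-round grouping above can be sketched like this. The data shapes are hypothetical (Expertiza's actual models differ); the point is that each round gets its own bucket, so each bucket can be rendered as its own grid against that round's rubric:

```ruby
# Group flat review responses into one bucket per review round so
# that a separate grid can be rendered for each round's rubric.
# Each response is assumed to carry the round it belongs to.
def responses_by_round(responses)
  responses.group_by { |r| r[:round] }  # { 1 => [...], 2 => [...] }
           .sort.to_h                   # render rounds in ascending order
end

responses = [
  { round: 2, score: 4 },
  { round: 1, score: 5 },
  { round: 1, score: 3 }
]
grids = responses_by_round(responses)
# grids.keys == [1, 2]; each value feeds one grid/table
```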
&lt;br /&gt;
===Issue 3=== &lt;br /&gt;
'''Description''': The table is presorted by team, but it can now also be sorted alphabetically. The cell view is far too long and should be divided into partials.&lt;br /&gt;
&lt;br /&gt;
'''Approach:''' We can either sort the select query alphabetically on the appropriate column, or sort the table on the client side by the user's chosen column using a dynamic table. The overly long view can be addressed with paging.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;i&amp;gt;Related screenshot&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:graph_g_b_4.jpg]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
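As a plain-Ruby sketch (not the actual Rails query), alphabetical sorting plus simple paging might look like the following; the row shape and page size are illustrative assumptions:

```ruby
# Sort score rows alphabetically by a chosen column, then split the
# result into fixed-size pages so no single view grows too long.
def sort_and_page(rows, column, per_page: 25)
  rows.sort_by { |row| row[column].to_s.downcase }  # case-insensitive A-Z
      .each_slice(per_page)                         # one array per page
      .to_a
end

rows = [{ name: 'zeta' }, { name: 'Alpha' }, { name: 'mike' }]
pages = sort_and_page(rows, :name, per_page: 2)
# pages[0] holds 'Alpha' and 'mike'; pages[1] holds 'zeta'
```

In the real application the sort would more likely be pushed into the database query (an `ORDER BY` on the selected column), with paging handled by the view layer.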
&lt;br /&gt;
===Issue 4===&lt;br /&gt;
'''Description:''' An interactive visualization or table that shows how a class performed on selected rubric criteria would be immensely helpful. It would show instructors what they need to focus more attention on.&lt;br /&gt;
&lt;br /&gt;
'''Approach:''' The goal is to show how the entire class performed on each of the five rubric criteria. Scores are color coded for each rubric criterion in each submission. On hovering over the graph, for each score the instructor sees the number of students and the percentage of students who received that score on that criterion in that submission.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;i&amp;gt;Related screenshot&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:graph_g_b_3.jpg]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
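The hover data described above (count and percentage of students per score) could be computed as in this sketch. The input shape is a hypothetical flat list of scores for one criterion, not Expertiza's actual data model:

```ruby
# For one rubric criterion, count how many students received each
# score and what percentage of the class that count represents --
# the numbers shown in the tooltip when hovering over the graph.
def score_distribution(scores)
  total = scores.length.to_f
  scores.tally.sort.to_h.transform_values do |count|
    { count: count, percentage: (100 * count / total).round(1) }
  end
end

dist = score_distribution([5, 3, 5, 4, 5])
# dist[5] => { count: 3, percentage: 60.0 }
```

A per-score hash like this maps directly onto a Highcharts tooltip formatter, with one series per criterion.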
&lt;br /&gt;
== '''Test Plan''' ==&lt;br /&gt;
'''Issue 1:''' Check the color coding when different scores are given on different scales.&lt;br /&gt;
*1) Login as instructor.&lt;br /&gt;
*2) Create a dummy assignment and some dummy reviews for it. Log out.&lt;br /&gt;
*3) Login as a student. Attempt the assignment. Log out.&lt;br /&gt;
*4) Login as another student and repeat step 3.&lt;br /&gt;
*5) Login as either student and attempt the review. Log out.&lt;br /&gt;
*6) Login as instructor. Go to the Review grades page and check the table. If the color coding ranges from red (for the lowest score) to green (for the highest score), the test passes.&lt;br /&gt;
&lt;br /&gt;
'''Issue 2:''' Enter multiple reviews with different rubrics and check that a separate table/grid is shown for each review round.&lt;br /&gt;
*1) Login as instructor.&lt;br /&gt;
*2) Create an assignment with two review rounds, selecting a different rubric for each round.&lt;br /&gt;
*3) Login as a student (&amp;quot;A&amp;quot;) and submit the assignment. Repeat this for another student (&amp;quot;B&amp;quot;).&lt;br /&gt;
*4) Login as student A and perform a review. Do the same as student B.&lt;br /&gt;
*5) Resubmit the assignment as students A and B.&lt;br /&gt;
*6) Resubmit the reviews as students A and B. This time the rubric differs from the previous round.&lt;br /&gt;
*7) Login as instructor and view the visualization of the reviews. A separate graph should appear for each review round.&lt;br /&gt;
&lt;br /&gt;
'''Issue 3:''' Check that the table can be sorted alphabetically by the appropriate column.&lt;br /&gt;
*1) Login as instructor.&lt;br /&gt;
*2) Create a dummy assignment with some teams. Log out.&lt;br /&gt;
*3) Login as a student, attempt the assignment, and log out.&lt;br /&gt;
*4) Repeat step 3 for all dummy teams.&lt;br /&gt;
*5) Login as instructor.&lt;br /&gt;
*6) Go to the View Scores page and check the grade table.&lt;br /&gt;
*7) Click on a column header and check whether the data in that column is sorted alphabetically. If so, the test passes.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===Issue 4: An interactive visualization or table that shows how a class performed on selected rubric criteria===&lt;br /&gt;
*1) Login as the instructor&lt;br /&gt;
*2) Click on the button to compute graphs&lt;br /&gt;
*3) Compare the bar graphs with the individual student scores in each assignment.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== '''References''' ==&lt;br /&gt;
*[https://github.com/expertiza/expertiza Expertiza repo]&lt;br /&gt;
*[http://expertiza.ncsu.edu/ The live Expertiza website]&lt;br /&gt;
*[https://api.highcharts.com/highcharts Highcharts API]&lt;/div&gt;</summary>
		<author><name>Proy4</name></author>
	</entry>
	<entry>
		<id>https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2017/E1792_OSS_Visualizations_for_instructors&amp;diff=113769</id>
		<title>CSC/ECE 517 Fall 2017/E1792 OSS Visualizations for instructors</title>
		<link rel="alternate" type="text/html" href="https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2017/E1792_OSS_Visualizations_for_instructors&amp;diff=113769"/>
		<updated>2017-11-28T00:18:28Z</updated>

		<summary type="html">&lt;p&gt;Proy4: /* References */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This wiki page is for the description of changes made under E1792 OSS Visualizations for instructors.&lt;br /&gt;
&lt;br /&gt;
__TOC__&lt;br /&gt;
&lt;br /&gt;
== About Expertiza ==&lt;br /&gt;
&lt;br /&gt;
[http://expertiza.ncsu.edu/ Expertiza] is an open source project based on [http://rubyonrails.org/ Ruby on Rails] framework. Expertiza allows the instructor to create new assignments and customize new or existing assignments. It also allows the instructor to create a list of topics the students can sign up for. Students can form teams in Expertiza to work on various projects and assignments. Students can also peer review other students' submissions. Instructors can also use Expertiza for interactive views of class performances and reviews.&lt;br /&gt;
&lt;br /&gt;
== Introduction ==&lt;br /&gt;
This project aims to improve the visualization of pages related to reviews and feedback in the instructor's view of Expertiza. Graphs and tables will help instructors judge review outcomes and class performance on assignments, which in turn eases the grading of reviews.&lt;br /&gt;
&lt;br /&gt;
== Problem Statement ==&lt;br /&gt;
:* &amp;lt;b&amp;gt;Issue 1&amp;lt;/b&amp;gt;: The color scale runs from blue to green, which is unintuitive, and the colors change randomly each time the page loads. It would be better to scale from red to green.&lt;br /&gt;
:* &amp;lt;b&amp;gt;Issue 2&amp;lt;/b&amp;gt;: Adjacent bars represent the responses from round 1 to round k. This makes sense only if the rubric is the same in every review round; if the instructor uses the vary-rubric-by-round mechanism, the visualization is misleading.&lt;br /&gt;
:* &amp;lt;b&amp;gt;Issue 3&amp;lt;/b&amp;gt;: The table on the View Scores page is presorted by team, but it can now also be sorted alphabetically. The cell view is far too long and should be divided into partials.&lt;br /&gt;
:* &amp;lt;b&amp;gt;Issue 4&amp;lt;/b&amp;gt;: An interactive visualization or table that shows how a class performed on selected rubric criteria would be immensely helpful. It would show instructors what they need to focus more attention on.&lt;br /&gt;
 	&lt;br /&gt;
== Solutions Implemented ==&lt;br /&gt;
===Issue 1=== &lt;br /&gt;
'''Description:''' The color scale runs from blue to green, which is unintuitive, and the colors change randomly each time the page loads. It would be better to scale from red to green.&lt;br /&gt;
&lt;br /&gt;
'''Approach:''' The current color coding runs from blue to green and changes randomly. We will instead use RGB color coding to produce a red-to-green scale, with the color determined by the score's percentage: for example, a score between 0% and 20% is shown in red, and so on. This way the color coding is consistent no matter what scale is used.&lt;br /&gt;
&amp;lt;i&amp;gt;Related screenshot&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:graph_g_b_1.jpg]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Issue 2=== &lt;br /&gt;
'''Description:''' Adjacent bars represent the responses from round 1 to round k. This makes sense only if the rubric is the same in every review round; if the instructor uses the vary-rubric-by-round mechanism, the visualization is misleading.&lt;br /&gt;
&lt;br /&gt;
'''Approach:''' The review responses are shown in a single grid covering all reviews from round 1 to round n. This works when the rubric is the same in every round, but when the rubric varies by round the combined view is misleading. We therefore create a separate grid for each review round, which handles both the varying-rubric and the same-rubric cases.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;i&amp;gt;Related screenshot&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:graph_g_b_2.jpg]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Issue 3=== &lt;br /&gt;
'''Description''': The table is presorted by team, but it can now also be sorted alphabetically. The cell view is far too long and should be divided into partials.&lt;br /&gt;
&lt;br /&gt;
'''Approach:''' We can either sort the select query alphabetically on the appropriate column, or sort the table on the client side by the user's chosen column using a dynamic table. The overly long view can be addressed with paging.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;i&amp;gt;Related screenshot&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:graph_g_b_4.jpg]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Issue 4===&lt;br /&gt;
'''Description:''' An interactive visualization or table that shows how a class performed on selected rubric criteria would be immensely helpful. It would show instructors what they need to focus more attention on.&lt;br /&gt;
&lt;br /&gt;
'''Approach:''' The goal is to show how the entire class performed on each of the five rubric criteria. Scores are color coded for each rubric criterion in each submission. On hovering over the graph, for each score the instructor sees the number of students and the percentage of students who received that score on that criterion in that submission.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;i&amp;gt;Related screenshot&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:graph_g_b_3.jpg]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== '''Test Plan''' ==&lt;br /&gt;
'''Issue 1:''' Check the color coding when different scores are given on different scales.&lt;br /&gt;
*1) Login as instructor.&lt;br /&gt;
*2) Create a dummy assignment and some dummy reviews for it. Log out.&lt;br /&gt;
*3) Login as a student. Attempt the assignment. Log out.&lt;br /&gt;
*4) Login as another student and repeat step 3.&lt;br /&gt;
*5) Login as either student and attempt the review. Log out.&lt;br /&gt;
*6) Login as instructor. Go to the Review grades page and check the table. If the color coding ranges from red (for the lowest score) to green (for the highest score), the test passes.&lt;br /&gt;
&lt;br /&gt;
'''Issue 2:''' Enter multiple reviews with different rubrics and check that a separate table/grid is shown for each review round.&lt;br /&gt;
*1) Login as instructor.&lt;br /&gt;
*2) Create an assignment with two review rounds, selecting a different rubric for each round.&lt;br /&gt;
*3) Login as a student (&amp;quot;A&amp;quot;) and submit the assignment. Repeat this for another student (&amp;quot;B&amp;quot;).&lt;br /&gt;
*4) Login as student A and perform a review. Do the same as student B.&lt;br /&gt;
*5) Resubmit the assignment as students A and B.&lt;br /&gt;
*6) Resubmit the reviews as students A and B. This time the rubric differs from the previous round.&lt;br /&gt;
*7) Login as instructor and view the visualization of the reviews. A separate graph should appear for each review round.&lt;br /&gt;
&lt;br /&gt;
'''Issue 3:''' Check that the table can be sorted alphabetically by the appropriate column.&lt;br /&gt;
*1) Login as instructor.&lt;br /&gt;
*2) Create a dummy assignment with some teams. Log out.&lt;br /&gt;
*3) Login as a student, attempt the assignment, and log out.&lt;br /&gt;
*4) Repeat step 3 for all dummy teams.&lt;br /&gt;
*5) Login as instructor.&lt;br /&gt;
*6) Go to the View Scores page and check the grade table.&lt;br /&gt;
*7) Click on a column header and check whether the data in that column is sorted alphabetically. If so, the test passes.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''Issue 4:'''&lt;br /&gt;
*1) Login as the instructor&lt;br /&gt;
*2) Click on the button to compute graphs&lt;br /&gt;
*3) Compare the bar graphs with the individual student scores in each assignment.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== '''References''' ==&lt;br /&gt;
*[https://github.com/expertiza/expertiza Expertiza repo]&lt;br /&gt;
*[http://expertiza.ncsu.edu/ The live Expertiza website]&lt;br /&gt;
*[https://api.highcharts.com/highcharts Highcharts API]&lt;/div&gt;</summary>
		<author><name>Proy4</name></author>
	</entry>
	<entry>
		<id>https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2017/E1792_OSS_Visualizations_for_instructors&amp;diff=113768</id>
		<title>CSC/ECE 517 Fall 2017/E1792 OSS Visualizations for instructors</title>
		<link rel="alternate" type="text/html" href="https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2017/E1792_OSS_Visualizations_for_instructors&amp;diff=113768"/>
		<updated>2017-11-28T00:17:46Z</updated>

		<summary type="html">&lt;p&gt;Proy4: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This wiki page is for the description of changes made under E1792 OSS Visualizations for instructors.&lt;br /&gt;
&lt;br /&gt;
__TOC__&lt;br /&gt;
&lt;br /&gt;
== About Expertiza ==&lt;br /&gt;
&lt;br /&gt;
[http://expertiza.ncsu.edu/ Expertiza] is an open source project based on [http://rubyonrails.org/ Ruby on Rails] framework. Expertiza allows the instructor to create new assignments and customize new or existing assignments. It also allows the instructor to create a list of topics the students can sign up for. Students can form teams in Expertiza to work on various projects and assignments. Students can also peer review other students' submissions. Instructors can also use Expertiza for interactive views of class performances and reviews.&lt;br /&gt;
&lt;br /&gt;
== Introduction ==&lt;br /&gt;
This project aims to improve the visualization of pages related to reviews and feedback in the instructor's view of Expertiza. Graphs and tables will help instructors judge review outcomes and class performance on assignments, which in turn eases the grading of reviews.&lt;br /&gt;
&lt;br /&gt;
== Problem Statement ==&lt;br /&gt;
:* &amp;lt;b&amp;gt;Issue 1&amp;lt;/b&amp;gt;: The color scale runs from blue to green, which is unintuitive, and the colors change randomly each time the page loads. It would be better to scale from red to green.&lt;br /&gt;
:* &amp;lt;b&amp;gt;Issue 2&amp;lt;/b&amp;gt;: Adjacent bars represent the responses from round 1 to round k. This makes sense only if the rubric is the same in every review round; if the instructor uses the vary-rubric-by-round mechanism, the visualization is misleading.&lt;br /&gt;
:* &amp;lt;b&amp;gt;Issue 3&amp;lt;/b&amp;gt;: The table on the View Scores page is presorted by team, but it can now also be sorted alphabetically. The cell view is far too long and should be divided into partials.&lt;br /&gt;
:* &amp;lt;b&amp;gt;Issue 4&amp;lt;/b&amp;gt;: An interactive visualization or table that shows how a class performed on selected rubric criteria would be immensely helpful. It would show instructors what they need to focus more attention on.&lt;br /&gt;
 	&lt;br /&gt;
== Solutions Implemented ==&lt;br /&gt;
===Issue 1=== &lt;br /&gt;
'''Description:''' The color scale runs from blue to green, which is unintuitive, and the colors change randomly each time the page loads. It would be better to scale from red to green.&lt;br /&gt;
&lt;br /&gt;
'''Approach:''' The current color coding runs from blue to green and changes randomly. We will instead use RGB color coding to produce a red-to-green scale, with the color determined by the score's percentage: for example, a score between 0% and 20% is shown in red, and so on. This way the color coding is consistent no matter what scale is used.&lt;br /&gt;
&amp;lt;i&amp;gt;Related screenshot&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:graph_g_b_1.jpg]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Issue 2=== &lt;br /&gt;
'''Description:''' Adjacent bars represent the responses from round 1 to round k. This makes sense only if the rubric is the same in every review round; if the instructor uses the vary-rubric-by-round mechanism, the visualization is misleading.&lt;br /&gt;
&lt;br /&gt;
'''Approach:''' The review responses are shown in a single grid covering all reviews from round 1 to round n. This works when the rubric is the same in every round, but when the rubric varies by round the combined view is misleading. We therefore create a separate grid for each review round, which handles both the varying-rubric and the same-rubric cases.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;i&amp;gt;Related screenshot&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:graph_g_b_2.jpg]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Issue 3=== &lt;br /&gt;
'''Description''': The table is presorted by team, but it can now also be sorted alphabetically. The cell view is far too long and should be divided into partials.&lt;br /&gt;
&lt;br /&gt;
'''Approach:''' We can either sort the select query alphabetically on the appropriate column, or sort the table on the client side by the user's chosen column using a dynamic table. The overly long view can be addressed with paging.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;i&amp;gt;Related screenshot&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:graph_g_b_4.jpg]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Issue 4===&lt;br /&gt;
'''Description:''' An interactive visualization or table that shows how a class performed on selected rubric criteria would be immensely helpful. It would show instructors what they need to focus more attention on.&lt;br /&gt;
&lt;br /&gt;
'''Approach:''' The goal is to show how the entire class performed on each of the five rubric criteria. Scores are color coded for each rubric criterion in each submission. On hovering over the graph, for each score the instructor sees the number of students and the percentage of students who received that score on that criterion in that submission.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;i&amp;gt;Related screenshot&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:graph_g_b_3.jpg]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== '''Test Plan''' ==&lt;br /&gt;
'''Issue 1:''' Check the color coding when different scores are given on different scales.&lt;br /&gt;
*1) Login as instructor.&lt;br /&gt;
*2) Create a dummy assignment and some dummy reviews for it. Log out.&lt;br /&gt;
*3) Login as a student. Attempt the assignment. Log out.&lt;br /&gt;
*4) Login as another student and repeat step 3.&lt;br /&gt;
*5) Login as either student and attempt the review. Log out.&lt;br /&gt;
*6) Login as instructor. Go to the Review grades page and check the table. If the color coding ranges from red (for the lowest score) to green (for the highest score), the test passes.&lt;br /&gt;
&lt;br /&gt;
'''Issue 2:''' Enter multiple reviews with different rubrics and check that a separate table/grid is shown for each review round.&lt;br /&gt;
*1) Login as instructor.&lt;br /&gt;
*2) Create an assignment with two review rounds, selecting a different rubric for each round.&lt;br /&gt;
*3) Login as a student (&amp;quot;A&amp;quot;) and submit the assignment. Repeat this for another student (&amp;quot;B&amp;quot;).&lt;br /&gt;
*4) Login as student A and perform a review. Do the same as student B.&lt;br /&gt;
*5) Resubmit the assignment as students A and B.&lt;br /&gt;
*6) Resubmit the reviews as students A and B. This time the rubric differs from the previous round.&lt;br /&gt;
*7) Login as instructor and view the visualization of the reviews. A separate graph should appear for each review round.&lt;br /&gt;
&lt;br /&gt;
'''Issue 3:''' Check that the table can be sorted alphabetically by the appropriate column.&lt;br /&gt;
*1) Login as instructor.&lt;br /&gt;
*2) Create a dummy assignment with some teams. Log out.&lt;br /&gt;
*3) Login as a student, attempt the assignment, and log out.&lt;br /&gt;
*4) Repeat step 3 for all dummy teams.&lt;br /&gt;
*5) Login as instructor.&lt;br /&gt;
*6) Go to the View Scores page and check the grade table.&lt;br /&gt;
*7) Click on a column header and check whether the data in that column is sorted alphabetically. If so, the test passes.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''Issue 4:'''&lt;br /&gt;
*1) Login as the instructor&lt;br /&gt;
*2) Click on the button to compute graphs&lt;br /&gt;
*3) Compare the bar graphs with the individual student scores in each assignment.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== '''References''' ==&lt;br /&gt;
*[https://github.com/expertiza/expertiza Expertiza repo]&lt;br /&gt;
*[http://expertiza.ncsu.edu/ The live Expertiza website]&lt;br /&gt;
*[https://api.highcharts.com/highcharts Highcharts API]&lt;/div&gt;</summary>
		<author><name>Proy4</name></author>
	</entry>
	<entry>
		<id>https://wiki.expertiza.ncsu.edu/index.php?title=File:Graph_g_b_4.jpg&amp;diff=113767</id>
		<title>File:Graph g b 4.jpg</title>
		<link rel="alternate" type="text/html" href="https://wiki.expertiza.ncsu.edu/index.php?title=File:Graph_g_b_4.jpg&amp;diff=113767"/>
		<updated>2017-11-28T00:13:53Z</updated>

		<summary type="html">&lt;p&gt;Proy4: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Proy4</name></author>
	</entry>
	<entry>
		<id>https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2017/E1792_OSS_Visualizations_for_instructors&amp;diff=113766</id>
		<title>CSC/ECE 517 Fall 2017/E1792 OSS Visualizations for instructors</title>
		<link rel="alternate" type="text/html" href="https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2017/E1792_OSS_Visualizations_for_instructors&amp;diff=113766"/>
		<updated>2017-11-28T00:13:36Z</updated>

		<summary type="html">&lt;p&gt;Proy4: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This wiki page is for the description of changes made under E1792 OSS Visualizations for instructors.&lt;br /&gt;
&lt;br /&gt;
__TOC__&lt;br /&gt;
&lt;br /&gt;
== About Expertiza ==&lt;br /&gt;
&lt;br /&gt;
[http://expertiza.ncsu.edu/ Expertiza] is an open source project based on [http://rubyonrails.org/ Ruby on Rails] framework. Expertiza allows the instructor to create new assignments and customize new or existing assignments. It also allows the instructor to create a list of topics the students can sign up for. Students can form teams in Expertiza to work on various projects and assignments. Students can also peer review other students' submissions. Instructors can also use Expertiza for interactive views of class performances and reviews.&lt;br /&gt;
&lt;br /&gt;
== Introduction ==&lt;br /&gt;
This project aims to improve the visualization of pages related to reviews and feedback in the instructor's view of Expertiza. Graphs and tables will help instructors judge review outcomes and class performance on assignments, which in turn eases the grading of reviews.&lt;br /&gt;
&lt;br /&gt;
== Problem Statement ==&lt;br /&gt;
:* &amp;lt;b&amp;gt;Issue 1&amp;lt;/b&amp;gt;: The color scale runs from blue to green, which is unintuitive, and the colors change randomly each time the page loads. It would be better to scale from red to green.&lt;br /&gt;
:* &amp;lt;b&amp;gt;Issue 2&amp;lt;/b&amp;gt;: Adjacent bars represent the responses from round 1 to round k. This makes sense only if the rubric is the same in every review round; if the instructor uses the vary-rubric-by-round mechanism, the visualization is misleading.&lt;br /&gt;
:* &amp;lt;b&amp;gt;Issue 3&amp;lt;/b&amp;gt;: The table on the View Scores page is presorted by team, but it can now also be sorted alphabetically. The cell view is far too long and should be divided into partials.&lt;br /&gt;
:* &amp;lt;b&amp;gt;Issue 4&amp;lt;/b&amp;gt;: An interactive visualization or table that shows how a class performed on selected rubric criteria would be immensely helpful. It would show instructors what they need to focus more attention on.&lt;br /&gt;
 	&lt;br /&gt;
== Solutions Implemented ==&lt;br /&gt;
===Issue 1=== &lt;br /&gt;
'''Description:''' The color scale runs from blue to green, which is unintuitive, and the colors change randomly each time the page loads. It would be better to scale from red to green.&lt;br /&gt;
&lt;br /&gt;
'''Approach:''' The current color coding runs from blue to green and changes randomly. We will instead use RGB color coding to produce a red-to-green scale, with the color determined by the score's percentage: for example, a score between 0% and 20% is shown in red, and so on. This way the color coding is consistent no matter what scale is used.&lt;br /&gt;
&amp;lt;i&amp;gt;Related screenshot&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:graph_g_b_1.jpg]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Issue 2=== &lt;br /&gt;
'''Description:''' Adjacent bars represent the responses from round 1 to round k. This makes sense only if the rubric is the same in every review round; if the instructor uses the vary-rubric-by-round mechanism, the visualization is misleading.&lt;br /&gt;
&lt;br /&gt;
'''Approach:''' The review responses are shown in a single grid covering all reviews from round 1 to round n. This works when the rubric is the same in every round, but when the rubric varies by round the combined view is misleading. We therefore create a separate grid for each review round, which handles both the varying-rubric and the same-rubric cases.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;i&amp;gt;Related screenshot&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:graph_g_b_2.jpg]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Issue 3=== &lt;br /&gt;
'''Description''': The table is presorted by team, but it can now also be sorted alphabetically. The cell view is far too long and should be divided into partials.&lt;br /&gt;
&lt;br /&gt;
'''Approach:''' We can either sort the select query alphabetically on the appropriate column, or sort the table on the client side by the user's chosen column using a dynamic table. The overly long view can be addressed with paging.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;i&amp;gt;Related screenshot&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:graph_g_b_4.jpg]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Issue 4===&lt;br /&gt;
'''Description:''' An interactive visualization or table that shows how a class performed on selected rubric criteria would be immensely helpful. It would show instructors what they need to focus more attention on.&lt;br /&gt;
&lt;br /&gt;
'''Approach:''' The goal is to show how the entire class performed on each of the five rubric criteria. Scores are color coded for each rubric criterion in each submission. On hovering over the graph, for each score the instructor sees the number of students and the percentage of students who received that score on that criterion in that submission.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;i&amp;gt;Related screenshot&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:graph_g_b_3.jpg]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== '''Test Plan''' ==&lt;br /&gt;
'''Issue 1:''' Check the color coding when different scores are given on different scales.&lt;br /&gt;
*1) Login as instructor.&lt;br /&gt;
*2) Create a dummy assignment and some dummy reviews for it. Log out.&lt;br /&gt;
*3) Login as a student. Attempt the assignment. Log out.&lt;br /&gt;
*4) Login as another student and repeat step 3.&lt;br /&gt;
*5) Login as either student and attempt the review. Log out.&lt;br /&gt;
*6) Login as instructor. Go to the Review grades page and check the table. If the color coding ranges from red (for the lowest score) to green (for the highest score), the test passes.&lt;br /&gt;
&lt;br /&gt;
'''Issue 2:''' Enter multiple reviews with different rubrics and check that a separate table/grid is shown for each review round.&lt;br /&gt;
*1) Login as instructor.&lt;br /&gt;
*2) Create an assignment with two review rounds, selecting a different rubric for each round.&lt;br /&gt;
*3) Login as a student (&amp;quot;A&amp;quot;) and submit the assignment. Repeat this for another student (&amp;quot;B&amp;quot;).&lt;br /&gt;
*4) Login as student A and perform a review. Do the same as student B.&lt;br /&gt;
*5) Resubmit the assignment as students A and B.&lt;br /&gt;
*6) Resubmit the reviews as students A and B. This time the rubric differs from the previous round.&lt;br /&gt;
*7) Login as instructor and view the visualization of the reviews. A separate graph should appear for each review round.&lt;br /&gt;
&lt;br /&gt;
'''Issue 3:''' Check that the table can be sorted alphabetically by the appropriate column.&lt;br /&gt;
*1) Login as instructor.&lt;br /&gt;
*2) Create a dummy assignment with some teams. Log out.&lt;br /&gt;
*3) Login as a student, attempt the assignment, and log out.&lt;br /&gt;
*4) Repeat step 3 for all dummy teams.&lt;br /&gt;
*5) Login as instructor.&lt;br /&gt;
*6) Go to the View Scores page and check the grade table.&lt;br /&gt;
*7) Click on a column header and check whether the data in that column is sorted alphabetically. If so, the test passes.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''Issue 4:'''&lt;br /&gt;
*1) Login as the instructor&lt;br /&gt;
*2) Click on the button to compute graphs&lt;br /&gt;
*3) Compare the bar graphs with the individual student scores in each assignment.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== '''References''' ==&lt;br /&gt;
*Expertiza repo: https://github.com/expertiza/expertiza&lt;/div&gt;</summary>
		<author><name>Proy4</name></author>
	</entry>
	<entry>
		<id>https://wiki.expertiza.ncsu.edu/index.php?title=File:Graph_g_b_3.jpg&amp;diff=113765</id>
		<title>File:Graph g b 3.jpg</title>
		<link rel="alternate" type="text/html" href="https://wiki.expertiza.ncsu.edu/index.php?title=File:Graph_g_b_3.jpg&amp;diff=113765"/>
		<updated>2017-11-28T00:12:25Z</updated>

		<summary type="html">&lt;p&gt;Proy4: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Proy4</name></author>
	</entry>
	<entry>
		<id>https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2017/E1792_OSS_Visualizations_for_instructors&amp;diff=113764</id>
		<title>CSC/ECE 517 Fall 2017/E1792 OSS Visualizations for instructors</title>
		<link rel="alternate" type="text/html" href="https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2017/E1792_OSS_Visualizations_for_instructors&amp;diff=113764"/>
		<updated>2017-11-28T00:12:10Z</updated>

		<summary type="html">&lt;p&gt;Proy4: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This wiki page is for the description of changes made under E1792 OSS Visualizations for instructors.&lt;br /&gt;
&lt;br /&gt;
__TOC__&lt;br /&gt;
&lt;br /&gt;
== About Expertiza ==&lt;br /&gt;
&lt;br /&gt;
[http://expertiza.ncsu.edu/ Expertiza] is an open source project based on [http://rubyonrails.org/ Ruby on Rails] framework. Expertiza allows the instructor to create new assignments and customize new or existing assignments. It also allows the instructor to create a list of topics the students can sign up for. Students can form teams in Expertiza to work on various projects and assignments. Students can also peer review other students' submissions. Instructors can also use Expertiza for interactive views of class performances and reviews.&lt;br /&gt;
&lt;br /&gt;
== Introduction ==&lt;br /&gt;
This project aims to improve the visualizations of certain pages related to reviews and feedback in Expertiza in the instructor's view. This would aid the instructors to judge outcomes of reviews and class performance in assignments via graphs and tables, which in turn would ease the process of grading the reviews.&lt;br /&gt;
&lt;br /&gt;
== Problem Statement ==&lt;br /&gt;
:* &amp;lt;b&amp;gt;Issue 1&amp;lt;/b&amp;gt;: The scale runs from blue to green, which does not make sense, and the colors change randomly each time the page loads. It would be better to scale from red to green.&lt;br /&gt;
:* &amp;lt;b&amp;gt;Issue 2&amp;lt;/b&amp;gt;: Two adjacent bars represent the responses in rounds 1 through k. This makes sense only if the rubrics in all review rounds are the same; if the instructor uses the vary-rubric-by-round mechanism, the visualization does not make sense.&lt;br /&gt;
:* &amp;lt;b&amp;gt;Issue 3&amp;lt;/b&amp;gt;: The table on the View Scores page is presorted by teams, but it should also be sortable alphabetically. The cell view is far too long and should be divided into partials.&lt;br /&gt;
:* &amp;lt;b&amp;gt;Issue 4&amp;lt;/b&amp;gt;: An interactive visualization or table that shows how a class performed on selected rubric criteria would be immensely helpful. It would show the instructors what they need to focus more attention on.&lt;br /&gt;
 	&lt;br /&gt;
== Solutions Implemented ==&lt;br /&gt;
===Issue 1=== &lt;br /&gt;
'''Description:''' The scale runs from blue to green, which does not make sense, and the colors change randomly each time the page loads. It would be better to scale from red to green.&lt;br /&gt;
&lt;br /&gt;
'''Approach:''' The color coding currently runs from blue to green and also changes randomly. We will use RGB color coding to produce a red-to-green scale, deciding each color from the score percentage: for example, a score between 0% and 20% is colored red, and so on. This way, whatever scale is in use, the color coding stays appropriate.&lt;br /&gt;
&amp;lt;i&amp;gt;Related screenshot&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:graph_g_b_1.jpg]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
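The percentage-to-color mapping described above can be sketched in Ruby. This is only an illustrative helper with hypothetical names, not the actual Expertiza code:&lt;br /&gt;

```ruby
# Hypothetical sketch (not the actual Expertiza helper): map a score to a
# red-to-green hex color by percentage, independent of the rubric scale.
def score_color(score, max_score)
  percent = (score.to_f / max_score * 100).round
  red   = [255 * (100 - percent) * 2 / 100, 255].min  # full red below 50%
  green = [255 * percent * 2 / 100, 255].min          # full green above 50%
  format("#%02x%02x00", red, green)
end
```

With this mapping a 0% score renders red, 50% yellow, and 100% green, whatever the underlying rubric scale is.&lt;br /&gt;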
&lt;br /&gt;
===Issue 2=== &lt;br /&gt;
'''Description:''' Two adjacent bars represent the responses in rounds 1 through k. This makes sense only if the rubrics in all review rounds are the same; if the instructor uses the vary-rubric-by-round mechanism, the visualization does not make sense.&lt;br /&gt;
&lt;br /&gt;
'''Approach:''' The review responses are displayed in a single grid that covers all reviews from round 1 to round n. This is fine when the rubric is the same for every round, but when the rubric varies by round the combined view does not make sense. We therefore create a separate grid for each review round, which works whether or not the rubrics differ between rounds.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;i&amp;gt;Related screenshot&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:graph_g_b_2.jpg]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Issue 3=== &lt;br /&gt;
'''Description:''' The table is presorted by teams, but it should also be sortable alphabetically. The cell view is far too long and should be divided into partials.&lt;br /&gt;
&lt;br /&gt;
'''Approach:''' We can order the select query alphabetically on the appropriate column, or sort the table dynamically according to the column the user chooses. The overly long view can be addressed with pagination.&lt;br /&gt;
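The column-sorting idea can be illustrated with a small Ruby sketch (hypothetical names, for illustration only; the real implementation would sort in the query or the view):&lt;br /&gt;

```ruby
# Hypothetical illustration (not the actual view code): sort score-table
# rows alphabetically on whichever column header the user clicks.
def sort_rows(rows, column)
  rows.sort_by { |row| row[column].to_s.downcase }
end

teams = [{ team: "zeta" }, { team: "Alpha" }, { team: "mu" }]
sorted = sort_rows(teams, :team).map { |r| r[:team] }
# sorted is ["Alpha", "mu", "zeta"]
```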
&lt;br /&gt;
===Issue 4===&lt;br /&gt;
'''Description:''' An interactive visualization or table that shows how a class performed on selected rubric criteria would be immensely helpful. It would show the instructor what to focus more attention on.&lt;br /&gt;
&lt;br /&gt;
'''Approach:''' This issue is about visualizing how the entire class performed on the five rubric criteria. Scores are color-coded for each rubric in each submission. When the instructor hovers over the graph, each score shows the number of students and the percentage of the class that earned that score on that rubric in that submission.&lt;br /&gt;
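The hover data described above can be sketched as a small Ruby helper (hypothetical, for illustration only): for one rubric criterion it tallies how many students earned each score and what percentage of the class that is.&lt;br /&gt;

```ruby
# Hypothetical sketch of the hover tooltip data: for one rubric criterion,
# count the students per score and the percentage of the class.
def score_distribution(scores)
  total = scores.size
  scores.tally.sort.map do |score, count|
    { score: score, students: count, percent: (100.0 * count / total).round(1) }
  end
end
```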
&lt;br /&gt;
&amp;lt;i&amp;gt;Related screenshot&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:graph_g_b_3.jpg]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== '''Test Plan''' ==&lt;br /&gt;
'''Issue 1:''' Check the color coding when different numbers are given on different scales.&lt;br /&gt;
*1) Login as instructor.&lt;br /&gt;
*2) Create a dummy assignment and some dummy reviews for it. Log out.&lt;br /&gt;
*3) Login as a student. Attempt the assignment. Log out.&lt;br /&gt;
*4) Login as another student and repeat step 3.&lt;br /&gt;
*5) Login as either student and attempt the review. Log out.&lt;br /&gt;
*6) Login as instructor. Go to the Review grades page and check the table. If the color coding ranges from red (lowest score) to green (highest score), the test passes.&lt;br /&gt;
&lt;br /&gt;
'''Issue 2:''' Enter multiple reviews with different rubrics and check that a separate table/grid is shown for each review round.&lt;br /&gt;
*1) Login as instructor.&lt;br /&gt;
*2) Create an assignment with 2 review rounds, selecting a different rubric for each round.&lt;br /&gt;
*3) Login as a student (&amp;quot;A&amp;quot;) and submit the assignment. Repeat this for another student (&amp;quot;B&amp;quot;).&lt;br /&gt;
*4) Login as student A and perform a review. Do the same as student B.&lt;br /&gt;
*5) Resubmit the assignment as students A and B.&lt;br /&gt;
*6) Resubmit the reviews as students A and B; this time the rubrics differ from the previous round.&lt;br /&gt;
*7) Login as instructor and view the visualization of the reviews. You should see a separate graph for each review round.&lt;br /&gt;
&lt;br /&gt;
'''Issue 3:''' Check whether the table can be sorted alphabetically by the appropriate column.&lt;br /&gt;
*1) Login as instructor.&lt;br /&gt;
*2) Create a dummy assignment with some teams. Log out.&lt;br /&gt;
*3) Login as a student and attempt the assignment, then log out.&lt;br /&gt;
*4) Repeat step 3 for all dummy teams.&lt;br /&gt;
*5) Login as instructor.&lt;br /&gt;
*6) Go to the View Scores page and check the grade table.&lt;br /&gt;
*7) Click on a column header and check whether its data is sorted alphabetically. If so, the test passes.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''Issue 4:'''&lt;br /&gt;
*1) Login as the instructor.&lt;br /&gt;
*2) Click on the button to compute the graphs.&lt;br /&gt;
*3) Compare the bar graphs against the individual student scores in each assignment.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== '''References''' ==&lt;br /&gt;
*Expertiza repo: https://github.com/expertiza/expertiza&lt;/div&gt;</summary>
		<author><name>Proy4</name></author>
	</entry>
	<entry>
		<id>https://wiki.expertiza.ncsu.edu/index.php?title=File:Graph_g_b_2.jpg&amp;diff=113763</id>
		<title>File:Graph g b 2.jpg</title>
		<link rel="alternate" type="text/html" href="https://wiki.expertiza.ncsu.edu/index.php?title=File:Graph_g_b_2.jpg&amp;diff=113763"/>
		<updated>2017-11-28T00:05:50Z</updated>

		<summary type="html">&lt;p&gt;Proy4: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Proy4</name></author>
	</entry>
	<entry>
		<id>https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2017/E1792_OSS_Visualizations_for_instructors&amp;diff=113762</id>
		<title>CSC/ECE 517 Fall 2017/E1792 OSS Visualizations for instructors</title>
		<link rel="alternate" type="text/html" href="https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2017/E1792_OSS_Visualizations_for_instructors&amp;diff=113762"/>
		<updated>2017-11-28T00:05:34Z</updated>

		<summary type="html">&lt;p&gt;Proy4: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This wiki page is for the description of changes made under E1792 OSS Visualizations for instructors.&lt;br /&gt;
&lt;br /&gt;
__TOC__&lt;br /&gt;
&lt;br /&gt;
== About Expertiza ==&lt;br /&gt;
&lt;br /&gt;
[http://expertiza.ncsu.edu/ Expertiza] is an open source project based on [http://rubyonrails.org/ Ruby on Rails] framework. Expertiza allows the instructor to create new assignments and customize new or existing assignments. It also allows the instructor to create a list of topics the students can sign up for. Students can form teams in Expertiza to work on various projects and assignments. Students can also peer review other students' submissions. Instructors can also use Expertiza for interactive views of class performances and reviews.&lt;br /&gt;
&lt;br /&gt;
== Introduction ==&lt;br /&gt;
This project aims to improve the visualizations of certain pages related to reviews and feedback in Expertiza in the instructor's view. This would aid the instructors to judge outcomes of reviews and class performance in assignments via graphs and tables, which in turn would ease the process of grading the reviews.&lt;br /&gt;
&lt;br /&gt;
== Problem Statement ==&lt;br /&gt;
:* &amp;lt;b&amp;gt;Issue 1&amp;lt;/b&amp;gt;: The scale runs from blue to green, which does not make sense, and the colors change randomly each time the page loads. It would be better to scale from red to green.&lt;br /&gt;
:* &amp;lt;b&amp;gt;Issue 2&amp;lt;/b&amp;gt;: Two adjacent bars represent the responses in rounds 1 through k. This makes sense only if the rubrics in all review rounds are the same; if the instructor uses the vary-rubric-by-round mechanism, the visualization does not make sense.&lt;br /&gt;
:* &amp;lt;b&amp;gt;Issue 3&amp;lt;/b&amp;gt;: The table on the View Scores page is presorted by teams, but it should also be sortable alphabetically. The cell view is far too long and should be divided into partials.&lt;br /&gt;
:* &amp;lt;b&amp;gt;Issue 4&amp;lt;/b&amp;gt;: An interactive visualization or table that shows how a class performed on selected rubric criteria would be immensely helpful. It would show the instructors what they need to focus more attention on.&lt;br /&gt;
 	&lt;br /&gt;
== Solutions Implemented ==&lt;br /&gt;
===Issue 1=== &lt;br /&gt;
'''Description:''' The scale runs from blue to green, which does not make sense, and the colors change randomly each time the page loads. It would be better to scale from red to green.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 1:''' The color coding currently runs from blue to green and also changes randomly. We will use RGB color coding to produce a red-to-green scale, deciding each color from the score percentage: for example, a score between 0% and 20% is colored red, and so on. This way, whatever scale is in use, the color coding stays appropriate.&lt;br /&gt;
&amp;lt;i&amp;gt;Related screenshot&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:graph_g_b_1.jpg]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Issue 2=== &lt;br /&gt;
'''Description:''' Two adjacent bars represent the responses in rounds 1 through k. This makes sense only if the rubrics in all review rounds are the same; if the instructor uses the vary-rubric-by-round mechanism, the visualization does not make sense.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 2:''' The review responses are displayed in a single grid that covers all reviews from round 1 to round n. This is fine when the rubric is the same for every round, but when the rubric varies by round the combined view does not make sense. We therefore create a separate grid for each review round, which works whether or not the rubrics differ between rounds.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;i&amp;gt;Related screenshot&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:graph_g_b_2.jpg]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Issue 3=== &lt;br /&gt;
'''Description:''' The table is presorted by teams, but it should also be sortable alphabetically. The cell view is far too long and should be divided into partials.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 3:''' We can order the select query alphabetically on the appropriate column, or sort the table dynamically according to the column the user chooses. The overly long view can be addressed with pagination.&lt;br /&gt;
&lt;br /&gt;
===Issue 4===&lt;br /&gt;
'''Description:''' An interactive visualization or table that shows how a class performed on selected rubric criteria would be immensely helpful. It would show the instructor what to focus more attention on.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 4:''' The issue is about grading the performance of the entire class on the five rubric criteria. A proposed solution is to reuse the logic from the controller assessment360_controller.rb that calculates the average grade for each student over all assignments in a particular course.&lt;br /&gt;
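As a rough illustration of that averaging idea (a hedged sketch with hypothetical names, not the actual assessment360_controller.rb code):&lt;br /&gt;

```ruby
# Hypothetical sketch: mean grade for one student across all assignments
# in a course, skipping assignments that have no grade yet.
def average_grade(grades_by_assignment)
  graded = grades_by_assignment.values.compact
  return nil if graded.empty?
  (graded.sum.to_f / graded.size).round(2)
end
```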
&lt;br /&gt;
== '''Test Plan''' ==&lt;br /&gt;
'''Issue 1:''' Check the color coding when different numbers are given on different scales.&lt;br /&gt;
*1) Login as instructor.&lt;br /&gt;
*2) Create a dummy assignment and some dummy reviews for it. Log out.&lt;br /&gt;
*3) Login as a student. Attempt the assignment. Log out.&lt;br /&gt;
*4) Login as another student and repeat step 3.&lt;br /&gt;
*5) Login as either student and attempt the review. Log out.&lt;br /&gt;
*6) Login as instructor. Go to the Review grades page and check the table. If the color coding ranges from red (lowest score) to green (highest score), the test passes.&lt;br /&gt;
&lt;br /&gt;
'''Issue 2:''' Enter multiple reviews with different rubrics and check that a separate table/grid is shown for each review round.&lt;br /&gt;
*1) Login as instructor.&lt;br /&gt;
*2) Create an assignment with 2 review rounds, selecting a different rubric for each round.&lt;br /&gt;
*3) Login as a student (&amp;quot;A&amp;quot;) and submit the assignment. Repeat this for another student (&amp;quot;B&amp;quot;).&lt;br /&gt;
*4) Login as student A and perform a review. Do the same as student B.&lt;br /&gt;
*5) Resubmit the assignment as students A and B.&lt;br /&gt;
*6) Resubmit the reviews as students A and B; this time the rubrics differ from the previous round.&lt;br /&gt;
*7) Login as instructor and view the visualization of the reviews. You should see a separate graph for each review round.&lt;br /&gt;
&lt;br /&gt;
'''Issue 3:''' Check whether the table can be sorted alphabetically by the appropriate column.&lt;br /&gt;
*1) Login as instructor.&lt;br /&gt;
*2) Create a dummy assignment with some teams. Log out.&lt;br /&gt;
*3) Login as a student and attempt the assignment, then log out.&lt;br /&gt;
*4) Repeat step 3 for all dummy teams.&lt;br /&gt;
*5) Login as instructor.&lt;br /&gt;
*6) Go to the View Scores page and check the grade table.&lt;br /&gt;
*7) Click on a column header and check whether its data is sorted alphabetically. If so, the test passes.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''Issue 4:'''&lt;br /&gt;
*1) Login as the instructor.&lt;br /&gt;
*2) Click on the button to compute the graphs.&lt;br /&gt;
*3) Compare the bar graphs against the individual student scores in each assignment.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== '''References''' ==&lt;br /&gt;
*Expertiza repo: https://github.com/expertiza/expertiza&lt;/div&gt;</summary>
		<author><name>Proy4</name></author>
	</entry>
	<entry>
		<id>https://wiki.expertiza.ncsu.edu/index.php?title=File:Graph_g_b_1.jpg&amp;diff=113761</id>
		<title>File:Graph g b 1.jpg</title>
		<link rel="alternate" type="text/html" href="https://wiki.expertiza.ncsu.edu/index.php?title=File:Graph_g_b_1.jpg&amp;diff=113761"/>
		<updated>2017-11-28T00:04:09Z</updated>

		<summary type="html">&lt;p&gt;Proy4: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Proy4</name></author>
	</entry>
	<entry>
		<id>https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2017/E1792_OSS_Visualizations_for_instructors&amp;diff=113760</id>
		<title>CSC/ECE 517 Fall 2017/E1792 OSS Visualizations for instructors</title>
		<link rel="alternate" type="text/html" href="https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2017/E1792_OSS_Visualizations_for_instructors&amp;diff=113760"/>
		<updated>2017-11-28T00:03:50Z</updated>

		<summary type="html">&lt;p&gt;Proy4: /* Solutions Implemented */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This wiki page is for the description of changes made under E1792 OSS Visualizations for instructors.&lt;br /&gt;
&lt;br /&gt;
__TOC__&lt;br /&gt;
&lt;br /&gt;
== About Expertiza ==&lt;br /&gt;
&lt;br /&gt;
[http://expertiza.ncsu.edu/ Expertiza] is an open source project based on [http://rubyonrails.org/ Ruby on Rails] framework. Expertiza allows the instructor to create new assignments and customize new or existing assignments. It also allows the instructor to create a list of topics the students can sign up for. Students can form teams in Expertiza to work on various projects and assignments. Students can also peer review other students' submissions. Instructors can also use Expertiza for interactive views of class performances and reviews.&lt;br /&gt;
&lt;br /&gt;
== Introduction ==&lt;br /&gt;
This project aims to improve the visualizations of certain pages related to reviews and feedback in Expertiza in the instructor's view. This would aid the instructors to judge outcomes of reviews and class performance in assignments via graphs and tables, which in turn would ease the process of grading the reviews.&lt;br /&gt;
&lt;br /&gt;
== Problem Statement ==&lt;br /&gt;
:* &amp;lt;b&amp;gt;Issue 1&amp;lt;/b&amp;gt;: The scale runs from blue to green, which does not make sense, and the colors change randomly each time the page loads. It would be better to scale from red to green.&lt;br /&gt;
:* &amp;lt;b&amp;gt;Issue 2&amp;lt;/b&amp;gt;: Two adjacent bars represent the responses in rounds 1 through k. This makes sense only if the rubrics in all review rounds are the same; if the instructor uses the vary-rubric-by-round mechanism, the visualization does not make sense.&lt;br /&gt;
:* &amp;lt;b&amp;gt;Issue 3&amp;lt;/b&amp;gt;: The table on the View Scores page is presorted by teams, but it should also be sortable alphabetically. The cell view is far too long and should be divided into partials.&lt;br /&gt;
:* &amp;lt;b&amp;gt;Issue 4&amp;lt;/b&amp;gt;: An interactive visualization or table that shows how a class performed on selected rubric criteria would be immensely helpful. It would show the instructors what they need to focus more attention on.&lt;br /&gt;
 	&lt;br /&gt;
== Solutions Implemented ==&lt;br /&gt;
===Issue 1=== &lt;br /&gt;
'''Description:''' The scale runs from blue to green, which does not make sense, and the colors change randomly each time the page loads. It would be better to scale from red to green.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 1:''' The color coding currently runs from blue to green and also changes randomly. We will use RGB color coding to produce a red-to-green scale, deciding each color from the score percentage: for example, a score between 0% and 20% is colored red, and so on. This way, whatever scale is in use, the color coding stays appropriate.&lt;br /&gt;
:&amp;lt;i&amp;gt;Related screenshot&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:graph_g_b_1.jpg]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Issue 2=== &lt;br /&gt;
'''Description:''' Two adjacent bars represent the responses in rounds 1 through k. This makes sense only if the rubrics in all review rounds are the same; if the instructor uses the vary-rubric-by-round mechanism, the visualization does not make sense.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 2:''' The review responses are displayed in a single grid that covers all reviews from round 1 to round n. This is fine when the rubric is the same for every round, but when the rubric varies by round the combined view does not make sense. We therefore create a separate grid for each review round, which works whether or not the rubrics differ between rounds.&lt;br /&gt;
&lt;br /&gt;
===Issue 3=== &lt;br /&gt;
'''Description:''' The table is presorted by teams, but it should also be sortable alphabetically. The cell view is far too long and should be divided into partials.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 3:''' We can order the select query alphabetically on the appropriate column, or sort the table dynamically according to the column the user chooses. The overly long view can be addressed with pagination.&lt;br /&gt;
&lt;br /&gt;
===Issue 4===&lt;br /&gt;
'''Description:''' An interactive visualization or table that shows how a class performed on selected rubric criteria would be immensely helpful. It would show the instructor what to focus more attention on.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 4:''' The issue is about grading the performance of the entire class on the five rubric criteria. A proposed solution is to reuse the logic from the controller assessment360_controller.rb that calculates the average grade for each student over all assignments in a particular course.&lt;br /&gt;
&lt;br /&gt;
== '''Test Plan''' ==&lt;br /&gt;
'''Issue 1:''' Check the color coding when different numbers are given on different scales.&lt;br /&gt;
*1) Login as instructor.&lt;br /&gt;
*2) Create a dummy assignment and some dummy reviews for it. Log out.&lt;br /&gt;
*3) Login as a student. Attempt the assignment. Log out.&lt;br /&gt;
*4) Login as another student and repeat step 3.&lt;br /&gt;
*5) Login as either student and attempt the review. Log out.&lt;br /&gt;
*6) Login as instructor. Go to the Review grades page and check the table. If the color coding ranges from red (lowest score) to green (highest score), the test passes.&lt;br /&gt;
&lt;br /&gt;
'''Issue 2:''' Enter multiple reviews with different rubrics and check that a separate table/grid is shown for each review round.&lt;br /&gt;
*1) Login as instructor.&lt;br /&gt;
*2) Create an assignment with 2 review rounds, selecting a different rubric for each round.&lt;br /&gt;
*3) Login as a student (&amp;quot;A&amp;quot;) and submit the assignment. Repeat this for another student (&amp;quot;B&amp;quot;).&lt;br /&gt;
*4) Login as student A and perform a review. Do the same as student B.&lt;br /&gt;
*5) Resubmit the assignment as students A and B.&lt;br /&gt;
*6) Resubmit the reviews as students A and B; this time the rubrics differ from the previous round.&lt;br /&gt;
*7) Login as instructor and view the visualization of the reviews. You should see a separate graph for each review round.&lt;br /&gt;
&lt;br /&gt;
'''Issue 3:''' Check whether the table can be sorted alphabetically by the appropriate column.&lt;br /&gt;
*1) Login as instructor.&lt;br /&gt;
*2) Create a dummy assignment with some teams. Log out.&lt;br /&gt;
*3) Login as a student and attempt the assignment, then log out.&lt;br /&gt;
*4) Repeat step 3 for all dummy teams.&lt;br /&gt;
*5) Login as instructor.&lt;br /&gt;
*6) Go to the View Scores page and check the grade table.&lt;br /&gt;
*7) Click on a column header and check whether its data is sorted alphabetically. If so, the test passes.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''Issue 4:'''&lt;br /&gt;
*1) Login as the instructor.&lt;br /&gt;
*2) Click on the button to compute the graphs.&lt;br /&gt;
*3) Compare the bar graphs against the individual student scores in each assignment.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== '''References''' ==&lt;br /&gt;
*Expertiza repo: https://github.com/expertiza/expertiza&lt;/div&gt;</summary>
		<author><name>Proy4</name></author>
	</entry>
	<entry>
		<id>https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2017/E1792_OSS_Visualizations_for_instructors&amp;diff=113759</id>
		<title>CSC/ECE 517 Fall 2017/E1792 OSS Visualizations for instructors</title>
		<link rel="alternate" type="text/html" href="https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2017/E1792_OSS_Visualizations_for_instructors&amp;diff=113759"/>
		<updated>2017-11-28T00:01:47Z</updated>

		<summary type="html">&lt;p&gt;Proy4: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This wiki page is for the description of changes made under E1792 OSS Visualizations for instructors.&lt;br /&gt;
&lt;br /&gt;
__TOC__&lt;br /&gt;
&lt;br /&gt;
== About Expertiza ==&lt;br /&gt;
&lt;br /&gt;
[http://expertiza.ncsu.edu/ Expertiza] is an open source project based on [http://rubyonrails.org/ Ruby on Rails] framework. Expertiza allows the instructor to create new assignments and customize new or existing assignments. It also allows the instructor to create a list of topics the students can sign up for. Students can form teams in Expertiza to work on various projects and assignments. Students can also peer review other students' submissions. Instructors can also use Expertiza for interactive views of class performances and reviews.&lt;br /&gt;
&lt;br /&gt;
== Introduction ==&lt;br /&gt;
This project aims to improve the visualizations of certain pages related to reviews and feedback in Expertiza in the instructor's view. This would aid the instructors to judge outcomes of reviews and class performance in assignments via graphs and tables, which in turn would ease the process of grading the reviews.&lt;br /&gt;
&lt;br /&gt;
== Problem Statement ==&lt;br /&gt;
:* &amp;lt;b&amp;gt;Issue 1&amp;lt;/b&amp;gt;: The scale runs from blue to green, which does not make sense, and the colors change randomly each time the page loads. It would be better to scale from red to green.&lt;br /&gt;
:* &amp;lt;b&amp;gt;Issue 2&amp;lt;/b&amp;gt;: Two adjacent bars represent the responses in rounds 1 through k. This makes sense only if the rubrics in all review rounds are the same; if the instructor uses the vary-rubric-by-round mechanism, the visualization does not make sense.&lt;br /&gt;
:* &amp;lt;b&amp;gt;Issue 3&amp;lt;/b&amp;gt;: The table on the View Scores page is presorted by teams, but it should also be sortable alphabetically. The cell view is far too long and should be divided into partials.&lt;br /&gt;
:* &amp;lt;b&amp;gt;Issue 4&amp;lt;/b&amp;gt;: An interactive visualization or table that shows how a class performed on selected rubric criteria would be immensely helpful. It would show the instructors what they need to focus more attention on.&lt;br /&gt;
 	&lt;br /&gt;
== Solutions Implemented ==&lt;br /&gt;
===Issue 1===&lt;br /&gt;
'''Description:''' The scale runs from blue to green, which does not make sense, and the colors change randomly each time the page loads. It would be better to scale from red to green.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 1:''' The color coding currently runs from blue to green and also changes randomly. We will use RGB color coding to produce a red-to-green scale, deciding each color from the score percentage: for example, a score between 0% and 20% is colored red, and so on. This way, whatever scale is in use, the color coding stays appropriate.&lt;br /&gt;
:&amp;lt;i&amp;gt;Related screenshot&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:graph_g_b_1.jpg]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''Issue 2:''' Two adjacent bars represent the responses in rounds 1 through k. This makes sense only if the rubrics in all review rounds are the same; if the instructor uses the vary-rubric-by-round mechanism, the visualization does not make sense.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 2:''' The review responses are displayed in a single grid that covers all reviews from round 1 to round n. This is fine when the rubric is the same for every round, but when the rubric varies by round the combined view does not make sense. We therefore create a separate grid for each review round, which works whether or not the rubrics differ between rounds.&lt;br /&gt;
&lt;br /&gt;
'''Issue 3:''' The table is presorted by teams, but it should also be sortable alphabetically. The cell view is far too long and should be divided into partials.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 3:''' We can order the select query alphabetically on the appropriate column, or sort the table dynamically according to the column the user chooses. The overly long view can be addressed with pagination.&lt;br /&gt;
&lt;br /&gt;
'''Issue 4:''' An interactive visualization or table that shows how a class performed on selected rubric criteria would be immensely helpful. It would show the instructor what to focus more attention on.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 4:''' The issue is about grading the performance of the entire class on the five rubric criteria. A proposed solution is to reuse the logic from the controller assessment360_controller.rb that calculates the average grade for each student over all assignments in a particular course.&lt;br /&gt;
 &lt;br /&gt;
&lt;br /&gt;
== '''Test Plan''' ==&lt;br /&gt;
'''Issue 1:''' Check the color coding when different number is given in different scale.&lt;br /&gt;
*1) Login as instructor&lt;br /&gt;
*2) Create a dummy assignment and come dummy reviews for the same.Log out.&lt;br /&gt;
*3) Login as a student. Attempt the assignment. Log out.&lt;br /&gt;
*4) Login as another student and repeat step 3.&lt;br /&gt;
*5) Login as either student and attempt the review. Logout.&lt;br /&gt;
*6) Login as instructor. Go to Review grades page and check the table. If color code ranges from red ( for least score) to green ( for highest score), then test passed.&lt;br /&gt;
&lt;br /&gt;
'''Issue 2:''' Enter multiple reviews with different rubrics and check that a separate table/grid is shown for each review round.&lt;br /&gt;
*1) Login as instructor.&lt;br /&gt;
*2) Create an assignment with 2 review rounds, selecting a different rubric for each round.&lt;br /&gt;
*3) Login as a student (&amp;quot;A&amp;quot;) and submit the assignment. Repeat this for another student (&amp;quot;B&amp;quot;).&lt;br /&gt;
*4) Login as student A and perform a review. Do this for student B too.&lt;br /&gt;
*5) Resubmit the assignment as students A and B.&lt;br /&gt;
*6) Resubmit the reviews as students A and B. This time the rubric will differ from the previous round.&lt;br /&gt;
*7) Login as instructor and view the visualization of the reviews. You should see a different graph for each round.&lt;br /&gt;
&lt;br /&gt;
'''Issue 3:''' Check whether the table can be sorted alphabetically by the appropriate column.&lt;br /&gt;
*1) Login as instructor&lt;br /&gt;
*2) Create a dummy assignment with some teams. Logout.&lt;br /&gt;
*3) Login as a student and attempt the assignment and logout.&lt;br /&gt;
*4) Repeat step 3 for all dummy teams.&lt;br /&gt;
*5) Login as instructor.&lt;br /&gt;
*6) Go to View Scores page. Check the grade table. &lt;br /&gt;
*7) Click on a column header and check if data in it is getting sorted alphabetically. If yes, then the test passed.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''Issue 4:'''&lt;br /&gt;
*1) Login as the instructor&lt;br /&gt;
*2) Click on the button to compute graphs&lt;br /&gt;
*3) Compare the bar graphs with the individual scores of students in each assignment.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== '''References''' ==&lt;br /&gt;
*Expertiza repo : https://github.com/expertiza/expertiza&lt;/div&gt;</summary>
		<author><name>Proy4</name></author>
	</entry>
	<entry>
		<id>https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2017/E1792_OSS_Visualizations_for_instructors&amp;diff=113758</id>
		<title>CSC/ECE 517 Fall 2017/E1792 OSS Visualizations for instructors</title>
		<link rel="alternate" type="text/html" href="https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2017/E1792_OSS_Visualizations_for_instructors&amp;diff=113758"/>
		<updated>2017-11-28T00:00:08Z</updated>

		<summary type="html">&lt;p&gt;Proy4: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This wiki page is for the description of changes made under E1792 OSS Visualizations for instructors.&lt;br /&gt;
&lt;br /&gt;
__TOC__&lt;br /&gt;
&lt;br /&gt;
== About Expertiza ==&lt;br /&gt;
&lt;br /&gt;
[http://expertiza.ncsu.edu/ Expertiza] is an open source project based on [http://rubyonrails.org/ Ruby on Rails] framework. Expertiza allows the instructor to create new assignments and customize new or existing assignments. It also allows the instructor to create a list of topics the students can sign up for. Students can form teams in Expertiza to work on various projects and assignments. Students can also peer review other students' submissions. Instructors can also use Expertiza for interactive views of class performances and reviews.&lt;br /&gt;
&lt;br /&gt;
== Introduction ==&lt;br /&gt;
This project aims to improve the visualizations of certain pages related to reviews and feedback in Expertiza in the instructor's view. This would aid the instructors to judge outcomes of reviews and class performance in assignments via graphs and tables, which in turn would ease the process of grading the reviews.&lt;br /&gt;
&lt;br /&gt;
== Problem Statement ==&lt;br /&gt;
:* &amp;lt;b&amp;gt;Issue 1&amp;lt;/b&amp;gt;: The scale is blue to green, which does not make sense, and the colors change randomly each time the page loads. It would be better to scale from red to green.&lt;br /&gt;
:* &amp;lt;b&amp;gt;Issue 2&amp;lt;/b&amp;gt;: Two adjacent bars represent the responses in rounds 1 through k. This makes sense only if the rubrics in all review rounds are the same. If the instructor uses the vary-rubric-by-round mechanism, this visualization will not make sense.&lt;br /&gt;
:* &amp;lt;b&amp;gt;Issue 3&amp;lt;/b&amp;gt;: The table on the View Scores page is presorted by teams, but it should also be sortable alphabetically. The cell view is far too long and should be divided into partials.&lt;br /&gt;
:* &amp;lt;b&amp;gt;Issue 4&amp;lt;/b&amp;gt;: An interactive visualization or table that shows how a class performed on selected rubric criteria would be immensely helpful. It would show instructors what they need to focus more attention on.&lt;br /&gt;
 	&lt;br /&gt;
== Solutions Implemented ==&lt;br /&gt;
'''Issue 1:''' The scale is blue to green, which does not make sense, and the colors change randomly each time the page loads. It would be better to scale from red to green.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 1:''' The color coding is currently blue to green and changes randomly. We will use RGB color coding to make the scale run from red to green, choosing the color by percentage score: for example, a score between 0 and 20% maps to red, and so on. This way the color coding is consistent no matter what grading scale is used.&lt;br /&gt;
:&amp;lt;i&amp;gt;Related screenshot&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:graph_g_b.png]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''Issue 2:''' Two adjacent bars represent the responses in rounds 1 through k. This makes sense only if the rubrics in all review rounds are the same. If the instructor uses the vary-rubric-by-round mechanism, this visualization will not make sense.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 2:''' The review responses are shown in a single grid covering all reviews from round 1 to round n. This works when the rubric is the same for every round, but when the rubric varies from round to round the combined view does not make sense. We therefore plan to create a separate grid for each review round, which handles both the varying-rubric and the same-rubric cases.&lt;br /&gt;
&lt;br /&gt;
'''Issue 3:''' The table is presorted by teams, but it should also be sortable alphabetically. The cell view is far too long and should be divided into partials.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 3:''' We can either sort the select query on the appropriate column alphabetically, or sort the table dynamically according to user-chosen criteria. The overly long view can be addressed with paging.&lt;br /&gt;
&lt;br /&gt;
'''Issue 4:''' An interactive visualization or table that shows how a class performed on selected rubric criteria would be immensely helpful. It would show me what I need to focus more attention on. &lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 4:''' The issue is about measuring the performance of the entire class against the five rubric criteria. A proposed solution is to reuse the logic from the controller assessment360_controller.rb that calculates each student's average grade across all assignments in a particular course.&lt;br /&gt;
 &lt;br /&gt;
&lt;br /&gt;
== '''Test Plan''' ==&lt;br /&gt;
'''Issue 1:''' Check the color coding when different scores are given on different scales.&lt;br /&gt;
*1) Login as instructor.&lt;br /&gt;
*2) Create a dummy assignment and some dummy reviews for it. Log out.&lt;br /&gt;
*3) Login as a student. Attempt the assignment. Log out.&lt;br /&gt;
*4) Login as another student and repeat step 3.&lt;br /&gt;
*5) Login as either student and attempt the review. Log out.&lt;br /&gt;
*6) Login as instructor. Go to the Review grades page and check the table. If the color coding ranges from red (lowest score) to green (highest score), the test passed.&lt;br /&gt;
&lt;br /&gt;
'''Issue 2:''' Enter multiple reviews with different rubrics and check that a separate table/grid is shown for each review round.&lt;br /&gt;
*1) Login as instructor.&lt;br /&gt;
*2) Create an assignment with 2 review rounds, selecting a different rubric for each round.&lt;br /&gt;
*3) Login as a student (&amp;quot;A&amp;quot;) and submit the assignment. Repeat this for another student (&amp;quot;B&amp;quot;).&lt;br /&gt;
*4) Login as student A and perform a review. Do this for student B too.&lt;br /&gt;
*5) Resubmit the assignment as students A and B.&lt;br /&gt;
*6) Resubmit the reviews as students A and B. This time the rubric will differ from the previous round.&lt;br /&gt;
*7) Login as instructor and view the visualization of the reviews. You should see a different graph for each round.&lt;br /&gt;
&lt;br /&gt;
'''Issue 3:''' Check whether the table can be sorted alphabetically by the appropriate column.&lt;br /&gt;
*1) Login as instructor&lt;br /&gt;
*2) Create a dummy assignment with some teams. Logout.&lt;br /&gt;
*3) Login as a student and attempt the assignment and logout.&lt;br /&gt;
*4) Repeat step 3 for all dummy teams.&lt;br /&gt;
*5) Login as instructor.&lt;br /&gt;
*6) Go to View Scores page. Check the grade table. &lt;br /&gt;
*7) Click on a column header and check if data in it is getting sorted alphabetically. If yes, then the test passed.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''Issue 4:'''&lt;br /&gt;
*1) Login as the instructor&lt;br /&gt;
*2) Click on the button to compute graphs&lt;br /&gt;
*3) Compare the bar graphs with the individual scores of students in each assignment.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== '''References''' ==&lt;br /&gt;
*Expertiza repo : https://github.com/expertiza/expertiza&lt;/div&gt;</summary>
		<author><name>Proy4</name></author>
	</entry>
	<entry>
		<id>https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2017/E1792_OSS_Visualizations_for_instructors&amp;diff=113752</id>
		<title>CSC/ECE 517 Fall 2017/E1792 OSS Visualizations for instructors</title>
		<link rel="alternate" type="text/html" href="https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2017/E1792_OSS_Visualizations_for_instructors&amp;diff=113752"/>
		<updated>2017-11-27T23:03:37Z</updated>

		<summary type="html">&lt;p&gt;Proy4: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This wiki page is for the description of changes made under E1792 OSS Visualizations for instructors.&lt;br /&gt;
&lt;br /&gt;
__TOC__&lt;br /&gt;
&lt;br /&gt;
== About Expertiza ==&lt;br /&gt;
&lt;br /&gt;
[http://expertiza.ncsu.edu/ Expertiza] is an open source project based on [http://rubyonrails.org/ Ruby on Rails] framework. Expertiza allows the instructor to create new assignments and customize new or existing assignments. It also allows the instructor to create a list of topics the students can sign up for. Students can form teams in Expertiza to work on various projects and assignments. Students can also peer review other students' submissions. Instructors can also use Expertiza for interactive views of class performances and reviews.&lt;br /&gt;
&lt;br /&gt;
== Introduction ==&lt;br /&gt;
This project aims to improve the visualizations of certain pages related to reviews and feedback in Expertiza in the instructor's view. This would aid the instructors to judge outcomes of reviews and class performance in assignments via graphs and tables, which in turn would ease the process of grading the reviews.&lt;br /&gt;
&lt;br /&gt;
== Problem Statement ==&lt;br /&gt;
:* &amp;lt;b&amp;gt;Issue 1&amp;lt;/b&amp;gt;: The scale is blue to green, which does not make sense, and the colors change randomly each time the page loads. It would be better to scale from red to green.&lt;br /&gt;
:* &amp;lt;b&amp;gt;Issue 2&amp;lt;/b&amp;gt;: Two adjacent bars represent the responses in rounds 1 through k. This makes sense only if the rubrics in all review rounds are the same. If the instructor uses the vary-rubric-by-round mechanism, this visualization will not make sense.&lt;br /&gt;
:* &amp;lt;b&amp;gt;Issue 3&amp;lt;/b&amp;gt;: The table on the View Scores page is presorted by teams, but it should also be sortable alphabetically. The cell view is far too long and should be divided into partials.&lt;br /&gt;
:* &amp;lt;b&amp;gt;Issue 4&amp;lt;/b&amp;gt;: An interactive visualization or table that shows how a class performed on selected rubric criteria would be immensely helpful. It would show instructors what they need to focus more attention on.&lt;br /&gt;
 	&lt;br /&gt;
== Solutions Implemented ==&lt;br /&gt;
'''Issue 1:''' The scale is blue to green, which does not make sense, and the colors change randomly each time the page loads. It would be better to scale from red to green.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 1:''' The color coding is currently blue to green and changes randomly. We will use RGB color coding to make the scale run from red to green, choosing the color by percentage score: for example, a score between 0 and 20% maps to red, and so on. This way the color coding is consistent no matter what grading scale is used.&lt;br /&gt;
&lt;br /&gt;
'''Issue 2:''' Two adjacent bars represent the responses in rounds 1 through k. This makes sense only if the rubrics in all review rounds are the same. If the instructor uses the vary-rubric-by-round mechanism, this visualization will not make sense.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 2:''' The review responses are shown in a single grid covering all reviews from round 1 to round n. This works when the rubric is the same for every round, but when the rubric varies from round to round the combined view does not make sense. We therefore plan to create a separate grid for each review round, which handles both the varying-rubric and the same-rubric cases.&lt;br /&gt;
&lt;br /&gt;
'''Issue 3:''' The table is presorted by teams, but it should also be sortable alphabetically. The cell view is far too long and should be divided into partials.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 3:''' We can either sort the select query on the appropriate column alphabetically, or sort the table dynamically according to user-chosen criteria. The overly long view can be addressed with paging.&lt;br /&gt;
&lt;br /&gt;
'''Issue 4:''' An interactive visualization or table that shows how a class performed on selected rubric criteria would be immensely helpful. It would show me what I need to focus more attention on. &lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 4:''' The issue is about measuring the performance of the entire class against the five rubric criteria. A proposed solution is to reuse the logic shown below, from the controller assessment360_controller.rb, which calculates each student's average grade across all assignments in a particular course.&lt;br /&gt;
 &lt;br /&gt;
[[File:scr2.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
The average grades of each student can be pulled and then used to plot a bar graph as shown below (created using dummy data):&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== '''Test Plan''' ==&lt;br /&gt;
'''Issue 1:''' Check the color coding when different scores are given on different scales.&lt;br /&gt;
*1) Login as instructor.&lt;br /&gt;
*2) Create a dummy assignment and some dummy reviews for it. Log out.&lt;br /&gt;
*3) Login as a student. Attempt the assignment. Log out.&lt;br /&gt;
*4) Login as another student and repeat step 3.&lt;br /&gt;
*5) Login as either student and attempt the review. Log out.&lt;br /&gt;
*6) Login as instructor. Go to the Review grades page and check the table. If the color coding ranges from red (lowest score) to green (highest score), the test passed.&lt;br /&gt;
&lt;br /&gt;
'''Issue 2:''' Enter multiple reviews with different rubrics and check that a separate table/grid is shown for each review round.&lt;br /&gt;
*1) Login as instructor.&lt;br /&gt;
*2) Create an assignment with 2 review rounds, selecting a different rubric for each round.&lt;br /&gt;
*3) Login as a student (&amp;quot;A&amp;quot;) and submit the assignment. Repeat this for another student (&amp;quot;B&amp;quot;).&lt;br /&gt;
*4) Login as student A and perform a review. Do this for student B too.&lt;br /&gt;
*5) Resubmit the assignment as students A and B.&lt;br /&gt;
*6) Resubmit the reviews as students A and B. This time the rubric will differ from the previous round.&lt;br /&gt;
*7) Login as instructor and view the visualization of the reviews. You should see a different graph for each round.&lt;br /&gt;
&lt;br /&gt;
'''Issue 3:''' Check whether the table can be sorted alphabetically by the appropriate column.&lt;br /&gt;
*1) Login as instructor&lt;br /&gt;
*2) Create a dummy assignment with some teams. Logout.&lt;br /&gt;
*3) Login as a student and attempt the assignment and logout.&lt;br /&gt;
*4) Repeat step 3 for all dummy teams.&lt;br /&gt;
*5) Login as instructor.&lt;br /&gt;
*6) Go to View Scores page. Check the grade table. &lt;br /&gt;
*7) Click on a column header and check if data in it is getting sorted alphabetically. If yes, then the test passed.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''Issue 4:'''&lt;br /&gt;
*1) Login as the instructor&lt;br /&gt;
*2) Click on the button to compute graphs&lt;br /&gt;
*3) Compare the bar graphs with the individual scores of students in each assignment.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== '''References''' ==&lt;br /&gt;
*Expertiza repo : https://github.com/expertiza/expertiza&lt;/div&gt;</summary>
		<author><name>Proy4</name></author>
	</entry>
	<entry>
		<id>https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2017/E1792_OSS_Visualizations_for_instructors&amp;diff=113750</id>
		<title>CSC/ECE 517 Fall 2017/E1792 OSS Visualizations for instructors</title>
		<link rel="alternate" type="text/html" href="https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2017/E1792_OSS_Visualizations_for_instructors&amp;diff=113750"/>
		<updated>2017-11-27T22:46:41Z</updated>

		<summary type="html">&lt;p&gt;Proy4: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This wiki page is for the description of changes made under E1792 OSS Visualizations for instructors.&lt;br /&gt;
&lt;br /&gt;
__TOC__&lt;br /&gt;
&lt;br /&gt;
== About Expertiza ==&lt;br /&gt;
&lt;br /&gt;
[http://expertiza.ncsu.edu/ Expertiza] is an open source project based on [http://rubyonrails.org/ Ruby on Rails] framework. Expertiza allows the instructor to create new assignments and customize new or existing assignments. It also allows the instructor to create a list of topics the students can sign up for. Students can form teams in Expertiza to work on various projects and assignments. Students can also peer review other students' submissions. Instructors can also use Expertiza for interactive views of class performances and reviews.&lt;br /&gt;
&lt;br /&gt;
== Introduction ==&lt;br /&gt;
This project aims to improve the visualizations of certain pages related to reviews and feedback in Expertiza in the instructor's view. This would aid the instructors to judge outcomes of reviews and class performance in assignments via graphs and tables, which in turn would ease the process of grading the reviews.&lt;br /&gt;
&lt;br /&gt;
== Problem Statement ==&lt;br /&gt;
:* &amp;lt;b&amp;gt;Issue 1&amp;lt;/b&amp;gt;: The scale is blue to green, which does not make sense, and the colors change randomly each time the page loads. It would be better to scale from red to green.&lt;br /&gt;
:* &amp;lt;b&amp;gt;Issue 2&amp;lt;/b&amp;gt;: Two adjacent bars represent the responses in rounds 1 through k. This makes sense only if the rubrics in all review rounds are the same. If the instructor uses the vary-rubric-by-round mechanism, this visualization will not make sense.&lt;br /&gt;
:* &amp;lt;b&amp;gt;Issue 3&amp;lt;/b&amp;gt;: The table on the View Scores page is presorted by teams, but it should also be sortable alphabetically. The cell view is far too long and should be divided into partials.&lt;br /&gt;
:* &amp;lt;b&amp;gt;Issue 4&amp;lt;/b&amp;gt;: An interactive visualization or table that shows how a class performed on selected rubric criteria would be immensely helpful. It would show instructors what they need to focus more attention on.&lt;br /&gt;
 	&lt;br /&gt;
== '''Approach to be followed to fix the issues''' ==&lt;br /&gt;
'''Issue 1:''' Each cell contains a link, but it is not clear what will happen when the link is clicked or which page will open.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 1:''' Each cell links to a different page in Expertiza, but there is no way to tell which cell is linked to which page, so users do not know the destination before they click. Since each cell contains a link, we will show hover-over text displaying the link of the destination page.&lt;br /&gt;
&lt;br /&gt;
'''Issue 2:''' The scale is blue to green, which does not make sense, and the colors change randomly each time the page loads. It would be better to scale from red to green.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 2:''' The color coding is currently blue to green and changes randomly. We will use RGB color coding to make the scale run from red to green, choosing the color by percentage score: for example, a score between 0 and 20% maps to red, and so on. This way the color coding is consistent no matter what grading scale is used.&lt;br /&gt;
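A minimal sketch of this percentage-based scheme (the band boundaries and hex values are illustrative, not the actual Expertiza code):&lt;br /&gt;

```ruby
# Map a score percentage to a fixed red-to-green color band, so the
# same percentage always yields the same color regardless of scale.
def score_color(percentage)
  case percentage
  when 0...20  then '#d9534f'  # red    - lowest band
  when 20...40 then '#f0ad4e'  # orange
  when 40...60 then '#ffd700'  # yellow
  when 60...80 then '#9acd32'  # yellow-green
  else              '#5cb85c'  # green  - highest band
  end
end
```

Because the band is derived from the percentage rather than generated per page load, the colors no longer change randomly between visits.&lt;br /&gt;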
&lt;br /&gt;
'''Issue 3:''' Two adjacent bars represent the responses in rounds 1 through k. This makes sense only if the rubrics in all review rounds are the same. If the instructor uses the vary-rubric-by-round mechanism, this visualization will not make sense.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 3:''' The review responses are shown in a single grid covering all reviews from round 1 to round n. This works when the rubric is the same for every round, but when the rubric varies from round to round the combined view does not make sense. We therefore plan to create a separate grid for each review round, which handles both the varying-rubric and the same-rubric cases.&lt;br /&gt;
&lt;br /&gt;
'''Issue 4:''' The table is presorted by teams, but it should also be sortable alphabetically. The cell view is far too long and should be divided into partials.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 4:''' We can either sort the select query on the appropriate column alphabetically, or sort the table dynamically according to user-chosen criteria. The overly long view can be addressed with paging.&lt;br /&gt;
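A rough sketch of the sort-plus-paging idea in plain Ruby (the row shape, column name, and page size are assumptions; the real fix would sort via the query or a dynamic table helper):&lt;br /&gt;

```ruby
# Sort score rows alphabetically on a chosen column, then return one
# page of results to keep the rendered view short.
ROWS_PER_PAGE = 25

def sort_and_page(rows, column, page = 1, per_page = ROWS_PER_PAGE)
  sorted = rows.sort_by { |row| row[column].to_s.downcase }  # case-insensitive alphabetical sort
  sorted.each_slice(per_page).to_a[page - 1] || []           # slice out the requested page
end
```

The same effect could come from the database side (an ORDER BY with LIMIT/OFFSET); the in-memory version above just illustrates the behavior.&lt;br /&gt;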
&lt;br /&gt;
'''Issue 5:''' An interactive visualization or table that shows how a class performed on selected rubric criteria would be immensely helpful. It would show me what I need to focus more attention on. &lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 5:''' The issue is about measuring the performance of the entire class against the five rubric criteria. A proposed solution is to reuse the logic shown below, from the controller assessment360_controller.rb, which calculates each student's average grade across all assignments in a particular course.&lt;br /&gt;
 &lt;br /&gt;
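A rough sketch of that averaging step (the data shape and method name are hypothetical; the actual code lives in assessment360_controller.rb):&lt;br /&gt;

```ruby
# Given each student's grades across all assignments in a course,
# compute the per-student average to feed the class-performance graph.
def course_averages(grades_by_student)
  grades_by_student.transform_values do |grades|
    grades.sum.to_f / grades.size
  end
end
```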
[[File:scr2.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
The average grades of each student can be pulled and then used to plot a bar graph as shown below (created using dummy data):&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Dummydata.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 1 :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Assignment1.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 2 :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Assignment2.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 3 :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Assignment3.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 4 :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Assignment4.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 5 :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Assignment5.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Average Class performance :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Average.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
The work flow can be approximately like the below  diagram :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Workflow.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''Issue 6:''' Integration of review performance- The basic aim of this implementation is to somehow combine an author’s feedback on a review and the corresponding review of a reviewer, to build a method for grading the reviewers.&lt;br /&gt;
Basic rubrics to be considered:&amp;lt;br&amp;gt;&lt;br /&gt;
-Number of reviews completed&amp;lt;br&amp;gt;&lt;br /&gt;
-Length of reviews&amp;lt;br&amp;gt;&lt;br /&gt;
-Summary of reviews&amp;lt;br&amp;gt;&lt;br /&gt;
-Whether reviewers added a file or link to their review &amp;lt;br&amp;gt;&lt;br /&gt;
-The average ratings they received from the author.&amp;lt;br&amp;gt;&lt;br /&gt;
-A graph or table is preferred to ease this change.&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 6:''' The entire focus is on the extent to which a review helped the author. The author’s feedback is always available in the form of numerical scores. We intend to capture the author’s score of the review topic-wise, such as the tone of the review, the plausibility of the solutions the reviewer suggested, and how much the review helped the author improve (scored by the author). &amp;lt;br&amp;gt;&lt;br /&gt;
-In each round of review done, the author’s feedback is noted topic-wise as suggested in the previous step and ''possibly generate a graph comparing the “n” rounds of reviews''.&amp;lt;br&amp;gt;&lt;br /&gt;
-If reviewer added ''a legit file or link to their review'', then we propose to add a few extra credits to the reviewer that can be added to their final grade.&lt;br /&gt;
The author’s feedback, in the form of an average score, can be taken, and if that value exceeds a threshold, the reviews were meaningful.&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For example - &lt;br /&gt;
Consider author’s feedback as follows for 2 rounds of reviews:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:ExampleTableForAuthorFeedback.PNG]] &amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
		&lt;br /&gt;
Observations: the mean score of round 1 is 3 and the mean score of round 2 is 1.9. &amp;lt;br&amp;gt;&lt;br /&gt;
Suppose we take into account that round 1 is more important than round 2, because it helps a reviewer improve, and that every reviewer should provide at least 2 suggestions or comments that help authors improve. Then the threshold value is 2, and the mean score of round 1 &amp;gt; threshold. &amp;lt;br&amp;gt;&lt;br /&gt;
Therefore the reviews were meaningful and deserve credit in the higher range of grades. If the mean score of round 1 &amp;lt; threshold, then the mean score of round 2 is compared with the threshold. &amp;lt;br&amp;gt;&lt;br /&gt;
		''Overall: mean round 1 &amp;gt; threshold : higher range of grade to reviewer.&amp;lt;br&amp;gt;&lt;br /&gt;
			  mean round 1 &amp;lt; threshold : check whether mean round 2 &amp;gt; threshold;&amp;lt;br&amp;gt;&lt;br /&gt;
							if true, medium range of grade to reviewer;&amp;lt;br&amp;gt;&lt;br /&gt;
							else, lower range of grades.&amp;lt;br&amp;gt;''&lt;br /&gt;
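The threshold rule above can be sketched in Ruby (the method name and range labels are illustrative, not existing Expertiza code):&lt;br /&gt;

```ruby
# Decide a reviewer's credit range from the authors' mean feedback
# scores: round 1 dominates, and round 2 is consulted only when
# round 1 falls below the threshold.
def reviewer_grade_range(mean_round1, mean_round2, threshold = 2)
  if mean_round1 > threshold
    :higher
  elsif mean_round2 > threshold
    :medium
  else
    :lower
  end
end
```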
Once the range of credit a reviewer deserves is decided, the other factors, such as the ''summary of review'' and ''length of review'', are taken into account by the instructor and graded according to the effectiveness of the concepts explained. Also, if a reviewer misses the second round of review, only the first round is taken into consideration. ''The instructor should assume that all but the last round of reviews are crucial to improving a document.''&amp;lt;br&amp;gt;&lt;br /&gt;
'''Graphs:''' Graphs can be generated to compare the effectiveness of the “n” rounds of reviews for the authors, based on the author’s scores. &amp;lt;br&amp;gt;&lt;br /&gt;
  &lt;br /&gt;
In this case - (only an example visualization)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:ExampleGraph.PNG]] &amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Observations: The graph and table together suggest that round 1 of the review actually had an impact on the author.&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
== '''Test Plan''' ==&lt;br /&gt;
'''Issue 1:''' Check that the link matches the hover-over text.&lt;br /&gt;
*1) Login as instructor&lt;br /&gt;
*2) Go to View Scores page. Hover over a cell.&lt;br /&gt;
*3) You will see a link being displayed upon hovering. Click on the cell.&lt;br /&gt;
*4) If the system directs you to the link that was displayed upon hovering, then the test passed.&lt;br /&gt;
&lt;br /&gt;
'''Issue 2:''' Check the color coding when different scores are given on different scales.&lt;br /&gt;
*1) Login as instructor.&lt;br /&gt;
*2) Create a dummy assignment and some dummy reviews for it. Log out.&lt;br /&gt;
*3) Login as a student. Attempt the assignment. Log out.&lt;br /&gt;
*4) Login as another student and repeat step 3.&lt;br /&gt;
*5) Login as either student and attempt the review. Log out.&lt;br /&gt;
*6) Login as instructor. Go to the Review grades page and check the table. If the color coding ranges from red (lowest score) to green (highest score), the test passed.&lt;br /&gt;
&lt;br /&gt;
'''Issue 3:''' Enter multiple reviews with different rubrics and check that a separate table/grid is shown for each review round.&lt;br /&gt;
*1) Login as instructor.&lt;br /&gt;
*2) Create an assignment with 2 review rounds, selecting a different rubric for each round.&lt;br /&gt;
*3) Login as a student (&amp;quot;A&amp;quot;) and submit the assignment. Repeat this for another student (&amp;quot;B&amp;quot;).&lt;br /&gt;
*4) Login as student A and perform a review. Do this for student B too.&lt;br /&gt;
*5) Resubmit the assignment as students A and B.&lt;br /&gt;
*6) Resubmit the reviews as students A and B. This time the rubric will differ from the previous round.&lt;br /&gt;
*7) Login as instructor and view the visualization of the reviews. You should see a different graph for each round.&lt;br /&gt;
&lt;br /&gt;
'''Issue 4:''' Check whether the table can be sorted alphabetically by the appropriate column.&lt;br /&gt;
*1) Login as instructor&lt;br /&gt;
*2) Create a dummy assignment with some teams. Logout.&lt;br /&gt;
*3) Login as a student and attempt the assignment and logout.&lt;br /&gt;
*4) Repeat step 3 for all dummy teams.&lt;br /&gt;
*5) Login as instructor.&lt;br /&gt;
*6) Go to View Scores page. Check the grade table. &lt;br /&gt;
*7) Click on a column header and check if data in it is getting sorted alphabetically. If yes, then the test passed.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''Issue 5:'''&lt;br /&gt;
*1) Login as the instructor&lt;br /&gt;
*2) Click on the button to compute graphs&lt;br /&gt;
*3) Compare the bar graphs with the individual scores of students in each assignment.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''Issue 6:''' Since it is a new algorithm being proposed, we intend to test it first manually using dummy datasets. Later on, automated test cases can be added for the same.&lt;br /&gt;
*1) Login as an instructor and create a dummy assignment with 2 rounds of review submissions. Assign 2 teams of 1 student each to that assignment. Also, set the deadline of the first review to &amp;quot;1 hour&amp;quot;. Log out.&lt;br /&gt;
*2) Login as a student and submit some data for the assignment. Log out.&lt;br /&gt;
*3) Login as the second student and submit the same assignment using dummy data. Log out.&lt;br /&gt;
All the above steps set the stage for the review phase.&lt;br /&gt;
*4) Login as the first student, go to &amp;quot;Other's work&amp;quot;, and request a document to review. By default, you will be assigned the second student's document to review. Once assigned, complete that review with dummy scores. Log out.&lt;br /&gt;
*5) Login as the second student. Check the review on the &amp;quot;Your scores&amp;quot; page and click on the &amp;quot;Show review&amp;quot; section. There you can provide scored feedback to the reviewer. Take into account the tone, the solutions suggested, and how much the review helps you. Log out.&lt;br /&gt;
*6) Login as the first student after the first review deadline passes. Go to &amp;quot;Other's work&amp;quot; and request a new submission. You will be assigned the same student's document to review for the second round. Follow step 4.&lt;br /&gt;
*7) Login as the second student and follow step 5.&lt;br /&gt;
*8) The system now has both the author's feedback and the reviewer's data, and will calculate the reviewer's score according to the ''range of credits'' (lower, medium, or higher range) set by the instructor.&lt;br /&gt;
&lt;br /&gt;
*Edge cases - These may include options like ''the reviewer adding a link to the review document'' and checking how the grade increases.&lt;br /&gt;
They may also cover the situation where the author's feedback isn't available; the reviewer's score would then depend solely on the reviews themselves and the instructor's judgment, since the algorithm would fail in this case.&lt;br /&gt;
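The range-of-credits decision described in step 8 can be sketched in plain Ruby. This is only an illustrative sketch of the proposed rule; the method names (`mean`, `credit_range`) and symbolic ranges are not part of Expertiza.

```ruby
# Sketch of the proposed range-of-credits rule: compare the mean author
# feedback per round against the instructor-set threshold.
def mean(scores)
  scores.sum.to_f / scores.size
end

# Returns :higher, :medium, or :lower depending on which round (if any)
# clears the threshold. Round 2 may be missing if the reviewer skipped it.
def credit_range(round1_scores, round2_scores, threshold)
  if mean(round1_scores) > threshold
    :higher
  elsif round2_scores && !round2_scores.empty? && mean(round2_scores) > threshold
    :medium
  else
    :lower
  end
end
```

Because the rule only compares means against a single threshold, the instructor can tune strictness by changing one number.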
&lt;br /&gt;
== '''References''' ==&lt;br /&gt;
*Expertiza repo : https://github.com/expertiza/expertiza&lt;/div&gt;</summary>
		<author><name>Proy4</name></author>
	</entry>
	<entry>
		<id>https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2017/E1792_OSS_Visualizations_for_instructors&amp;diff=113749</id>
		<title>CSC/ECE 517 Fall 2017/E1792 OSS Visualizations for instructors</title>
		<link rel="alternate" type="text/html" href="https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2017/E1792_OSS_Visualizations_for_instructors&amp;diff=113749"/>
		<updated>2017-11-27T22:45:29Z</updated>

		<summary type="html">&lt;p&gt;Proy4: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This wiki page is for the description of changes made under E1792 OSS Visualizations for instructors.&lt;br /&gt;
&lt;br /&gt;
__TOC__&lt;br /&gt;
&lt;br /&gt;
== About Expertiza ==&lt;br /&gt;
&lt;br /&gt;
[http://expertiza.ncsu.edu/ Expertiza] is an open source project based on [http://rubyonrails.org/ Ruby on Rails] framework. Expertiza allows the instructor to create new assignments and customize new or existing assignments. It also allows the instructor to create a list of topics the students can sign up for. Students can form teams in Expertiza to work on various projects and assignments. Students can also peer review other students' submissions. Instructors can also use Expertiza for interactive views of class performances and reviews.&lt;br /&gt;
&lt;br /&gt;
== Introduction ==&lt;br /&gt;
This project aims to improve the visualizations of certain pages related to reviews and feedback in Expertiza in the instructor's view. This would aid the instructors to judge outcomes of reviews and class performance in assignments via graphs and tables, which in turn would ease the process of grading the reviews.&lt;br /&gt;
&lt;br /&gt;
== Problem Statement ==&lt;br /&gt;
:* Issue 1: The scale is blue to green, which does not make sense, and the colors change randomly each time the page loads. It would be better to scale from red to green.&lt;br /&gt;
:* Issue 2: Two adjacent bars represent the responses in rounds 1 through k. This makes sense only if the rubrics in all review rounds are the same. If the instructor uses the vary-rubric-by-round mechanism, this visualization will not make sense.&lt;br /&gt;
:* Issue 3: The table on the View Scores page is presorted by team, but it should now also be sortable alphabetically. The cell view is far too long and should be divided into partials.&lt;br /&gt;
:* Issue 4: An interactive visualization or table that shows how a class performed on selected rubric criteria would be immensely helpful. It would show the instructors what they need to focus more attention on.&lt;br /&gt;
 	&lt;br /&gt;
== '''Approach to be followed to fix the issues''' ==&lt;br /&gt;
'''Issue 1:''' Each cell contains a link, but it is unclear which page will open when the link is clicked.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 1:''' Each cell links to a different page in Expertiza, but there is no way to tell which cell is linked to which page, so users do not know the destination before they click. Since every cell contains a link, we will show hover-over text that displays the URL of the destination page.&lt;br /&gt;
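The idea can be sketched as follows (an illustrative sketch, not the actual Expertiza view code; the helper name and URL are hypothetical): render each cell with a `title` attribute equal to its `href`, so the hover-over text shows the destination.

```ruby
# Illustrative sketch: render a score cell whose hover-over text (the HTML
# title attribute) displays the same URL the link points to.
def score_cell(url, label)
  %(<a href="#{url}" title="#{url}">#{label}</a>)
end

cell = score_cell("/response/view?id=42", "95")
```

A test for this behavior only has to check that the `title` and `href` attributes of a cell match.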
&lt;br /&gt;
'''Issue 2:''' The scale is blue to green, which does not make sense, and the colors change randomly each time the page loads. It would be better to scale from red to green.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 2:''' The current color coding runs from blue to green and also changes randomly. We will use RGB color coding to scale from red to green, deciding the color by percentage: for example, a score between 0 and 20% is red, and so on. This way, whatever scale is used, the color coding will always be appropriate.&lt;br /&gt;
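A minimal sketch of the percentage-based red-to-green mapping (the helper name `score_color` is illustrative; the real implementation could bucket scores instead of interpolating continuously):

```ruby
# Map a score percentage (0-100) to a red-to-green hex color.
# Low scores are red, high scores are green; the blue channel stays at zero,
# so the scale never drifts toward blue.
def score_color(percent)
  percent = [[percent, 0].max, 100].min   # clamp to the 0-100 range
  red   = ((100 - percent) * 255 / 100.0).round
  green = (percent * 255 / 100.0).round
  format("#%02x%02x00", red, green)
end
```

Because the color is a pure function of the percentage, it no longer changes between page loads, and it adapts to any rubric scale once scores are normalized to percentages.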
&lt;br /&gt;
'''Issue 3:''' Two adjacent bars represent the responses in rounds 1 through k. This makes sense only if the rubrics in all review rounds are the same. If the instructor uses the vary-rubric-by-round mechanism, this visualization will not make sense.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 3:''' The review responses are displayed in a single grid covering every round from 1 to n. This works when the rubric is the same for all rounds, but when the rubric differs from round to round the view does not make sense. We therefore propose creating a separate grid for each review round, which handles both the varying-rubric and the same-rubric cases.&lt;br /&gt;
&lt;br /&gt;
'''Issue 4:''' The table is presorted by team, but it should now also be sortable alphabetically. The cell view is far too long and should be divided into partials.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 4:''' We can either sort the select query alphabetically on the appropriate column, or sort the table dynamically according to user-chosen criteria. The overly long cell view can be addressed with paging.&lt;br /&gt;
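A sketch of the alphabetical column sort in plain Ruby (standing in for an ActiveRecord `order` clause; the `Row` struct and its fields are hypothetical, not the actual grade-table schema):

```ruby
# Sort grade-table rows alphabetically on a chosen column,
# case-insensitively, as the clicked column header would request.
Row = Struct.new(:team_name, :score)

def sort_rows(rows, column)
  rows.sort_by { |row| row.public_send(column).to_s.downcase }
end

rows = [Row.new("wolfpack", 85), Row.new("Apex", 90), Row.new("mavericks", 78)]
sorted = sort_rows(rows, :team_name)
```

Downcasing before comparison keeps the order stable regardless of how team names are capitalized.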
&lt;br /&gt;
'''Issue 5:''' An interactive visualization or table that shows how a class performed on selected rubric criteria would be immensely helpful. It would show instructors what they need to focus more attention on. &lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 5:''' The issue concerns grading the performance of the entire class using the five rubric criteria. A proposed solution is to use the logic below, from the controller assessment360_controller.rb, which calculates the average grade for each student across all assignments in a particular course.&lt;br /&gt;
 &lt;br /&gt;
[[File:scr2.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
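The average-grade calculation can be sketched with dummy in-memory data (the hash layout and names below are illustrative; the real logic would query the models used by assessment360_controller.rb):

```ruby
# Dummy per-student grades keyed by assignment, standing in for
# the database queries made by the controller.
grades = {
  "student_a" => { "Assignment 1" => 90, "Assignment 2" => 80 },
  "student_b" => { "Assignment 1" => 70, "Assignment 2" => 100 },
}

# Average each student's grades across all of their assignments;
# these averages are what the bar graph would plot.
averages = grades.transform_values do |by_assignment|
  (by_assignment.values.sum.to_f / by_assignment.size).round(2)
end
```

Each value in `averages` corresponds to one bar in the proposed class-performance graph.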
The average grade of each student can be pulled and then used to plot a bar graph, as shown below (created using dummy data) :-&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Dummydata.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 1 :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Assignment1.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 2 :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Assignment2.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 3 :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Assignment3.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 4 :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Assignment4.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 5 :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Assignment5.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for average class performance :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Average.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
The workflow will look approximately like the diagram below :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Workflow.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''Issue 6:''' Integration of review performance - The aim of this implementation is to combine an author’s feedback on a review with the reviewer’s corresponding review, in order to build a method for grading reviewers.&lt;br /&gt;
Basic rubrics to be considered:&amp;lt;br&amp;gt;&lt;br /&gt;
-Number of reviews completed&amp;lt;br&amp;gt;&lt;br /&gt;
-Length of reviews&amp;lt;br&amp;gt;&lt;br /&gt;
-Summary of reviews&amp;lt;br&amp;gt;&lt;br /&gt;
-Whether reviewers added a file or link to their review &amp;lt;br&amp;gt;&lt;br /&gt;
-The average ratings they received from the author.&amp;lt;br&amp;gt;&lt;br /&gt;
-A graph or table is preferred for presenting these results.&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 6:''' The focus is on the extent to which a review helped the author. The author’s feedback is always available in the form of numerical scores. We intend to capture the author’s score for the review topic-wise, covering the tone of the review, the plausibility of the solutions the reviewer suggested, and how much the review helped the author improve (as rated by the author). &amp;lt;br&amp;gt;&lt;br /&gt;
-In each review round, the author’s feedback is recorded topic-wise as suggested above, and we can ''possibly generate a graph comparing the “n” rounds of reviews''.&amp;lt;br&amp;gt;&lt;br /&gt;
-If the reviewer added ''a legitimate file or link to their review'', we propose awarding a few extra credits that can be added to the reviewer's final grade.&lt;br /&gt;
The author’s feedback can be averaged into a single number; if that value exceeds a threshold, the reviews were genuinely meaningful.&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For example - &lt;br /&gt;
Consider author’s feedback as follows for 2 rounds of reviews:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:ExampleTableForAuthorFeedback.PNG]] &amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
		&lt;br /&gt;
Observations: The mean score of round 1 is 3 and the mean score of round 2 is 1.9. &amp;lt;br&amp;gt;&lt;br /&gt;
Suppose we decide that round 1 matters more than round 2, because it gives the author time to improve, and that every reviewer should provide at least 2 suggestions or comments that help authors improve. The threshold value is then 2, and the mean score of round 1 &amp;gt; threshold. &amp;lt;br&amp;gt;&lt;br /&gt;
Therefore, the reviews were meaningful and deserve credit in the higher range of grades. If the mean score of round 1 &amp;lt; threshold, then the mean score of round 2 is compared with the threshold.  &amp;lt;br&amp;gt;&lt;br /&gt;
		''Overall : Mean round 1&amp;gt; threshold : Higher range of grade to reviewer.&amp;lt;br&amp;gt;&lt;br /&gt;
			  Mean round 1&amp;lt;threshold  : Check if Mean round 2 &amp;gt; threshold &amp;lt;br&amp;gt;&lt;br /&gt;
							If true , Medium range of grade to reviewer.&amp;lt;br&amp;gt;&lt;br /&gt;
							Else , lower range of grades.&amp;lt;br&amp;gt;''&lt;br /&gt;
Once it is decided in what range a reviewer deserves credit, the other factors, such as the ''summary of the review'' and the ''length of the review'', are taken into account by the instructor and graded according to the effectiveness of the concepts explained. If a reviewer misses the second round of review, only the first round is taken into consideration. ''The instructor should assume that all but the last round of reviews are crucial to improving a document.''&amp;lt;br&amp;gt;&lt;br /&gt;
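The threshold rule above can be sketched in plain Ruby (an illustrative sketch only; the method names `mean` and `credit_range` and the symbolic ranges are not part of Expertiza):

```ruby
# Sketch of the proposed range-of-credits rule: compare the mean author
# feedback per round against the instructor-set threshold.
def mean(scores)
  scores.sum.to_f / scores.size
end

# Returns :higher, :medium, or :lower depending on which round (if any)
# clears the threshold. Round 2 may be missing if the reviewer skipped it.
def credit_range(round1_scores, round2_scores, threshold)
  if mean(round1_scores) > threshold
    :higher
  elsif round2_scores && !round2_scores.empty? && mean(round2_scores) > threshold
    :medium
  else
    :lower
  end
end
```

With the worked example (round 1 mean 3, round 2 mean 1.9, threshold 2), this returns the higher range, matching the observation above.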
'''Graphs:''' These can be generated to compare the effectiveness of the “n” rounds of reviews for the authors, based on the author’s scores. &amp;lt;br&amp;gt;&lt;br /&gt;
  &lt;br /&gt;
In this case - (only an example visualization)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:ExampleGraph.PNG]] &amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Observations: The graph and table together suggest that round 1 of the review actually had an impact on the author.&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
== '''Test Plan''' ==&lt;br /&gt;
'''Issue 1:''' Check that the link matches the hover-over text&lt;br /&gt;
*1) Login as instructor&lt;br /&gt;
*2) Go to View Scores page. Hover over a cell.&lt;br /&gt;
*3) You will see a link being displayed upon hovering. Click on the cell.&lt;br /&gt;
*4) If the system directs you to the link that was displayed on hover, the test passes.&lt;br /&gt;
&lt;br /&gt;
'''Issue 2:''' Check the color coding when different numbers are given on different scales.&lt;br /&gt;
*1) Login as instructor&lt;br /&gt;
*2) Create a dummy assignment and some dummy reviews for it. Log out.&lt;br /&gt;
*3) Login as a student. Attempt the assignment. Log out.&lt;br /&gt;
*4) Login as another student and repeat step 3.&lt;br /&gt;
*5) Login as either student and attempt the review. Logout.&lt;br /&gt;
*6) Login as the instructor. Go to the Review grades page and check the table. If the color coding ranges from red (for the lowest score) to green (for the highest score), the test passes.&lt;br /&gt;
&lt;br /&gt;
'''Issue 3:''' Enter multiple reviews with different rubrics and check that a separate table/grid is shown for each review round.&lt;br /&gt;
*1) Login as instructor.&lt;br /&gt;
*2) Create an assignment with two review rounds, and select a different rubric for each round.&lt;br /&gt;
*3) Login as a student (&amp;quot;A&amp;quot;) and submit the assignment. Repeat this for another student (&amp;quot;B&amp;quot;).&lt;br /&gt;
*4) Login as student A and perform a review. Do the same for student B.&lt;br /&gt;
*5) Now resubmit the assignment as students A and B.&lt;br /&gt;
*6) Resubmit the reviews as students A and B. This time the rubric will differ from the previous round.&lt;br /&gt;
*7) Now login as the instructor and view the review visualizations. A separate graph appears for each submission round.&lt;br /&gt;
&lt;br /&gt;
'''Issue 4:''' Check that the table can be sorted alphabetically by the appropriate column&lt;br /&gt;
*1) Login as instructor&lt;br /&gt;
*2) Create a dummy assignment with some teams. Logout.&lt;br /&gt;
*3) Login as a student and attempt the assignment and logout.&lt;br /&gt;
*4) Repeat step 3 for all dummy teams.&lt;br /&gt;
*5) Login as instructor.&lt;br /&gt;
*6) Go to View Scores page. Check the grade table. &lt;br /&gt;
*7) Click on a column header and check whether its data is sorted alphabetically. If so, the test passes.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''Issue 5:'''&lt;br /&gt;
*1) Login as the instructor&lt;br /&gt;
*2) Click on the button to compute graphs&lt;br /&gt;
*3) Compare the bar graphs against the individual scores of students in each assignment.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''Issue 6:''' Since it is a new algorithm being proposed, we intend to test it first manually using dummy datasets. Later on, automated test cases can be added for the same.&lt;br /&gt;
*1) Login as an instructor and create a dummy assignment with 2 rounds of review submissions. Assign 2 teams of 1 student each to that assignment. Also, set the deadline of the first review to &amp;quot;1 hour&amp;quot;. Log out.&lt;br /&gt;
*2) Login as a student and submit some data for the assignment. Log out.&lt;br /&gt;
*3) Login as the second student and submit the same assignment using dummy data. Log out.&lt;br /&gt;
All the above steps set the stage for the review phase.&lt;br /&gt;
*4) Login as the first student, go to &amp;quot;Other's work&amp;quot;, and request a document to review. By default, you will be assigned the second student's document to review. Once assigned, complete that review with dummy scores. Log out.&lt;br /&gt;
*5) Login as the second student. Check the review on the &amp;quot;Your scores&amp;quot; page and click on the &amp;quot;Show review&amp;quot; section. There you can provide scored feedback to the reviewer. Take into account the tone, the solutions suggested, and how much the review helps you. Log out.&lt;br /&gt;
*6) Login as the first student after the first review deadline passes. Go to &amp;quot;Other's work&amp;quot; and request a new submission. You will be assigned the same student's document to review for the second round. Follow step 4.&lt;br /&gt;
*7) Login as the second student and follow step 5.&lt;br /&gt;
*8) The system now has both the author's feedback and the reviewer's data, and will calculate the reviewer's score according to the ''range of credits'' (lower, medium, or higher range) set by the instructor.&lt;br /&gt;
&lt;br /&gt;
*Edge cases - These may include options like ''the reviewer adding a link to the review document'' and checking how the grade increases.&lt;br /&gt;
They may also cover the situation where the author's feedback isn't available; the reviewer's score would then depend solely on the reviews themselves and the instructor's judgment, since the algorithm would fail in this case.&lt;br /&gt;
&lt;br /&gt;
== '''References''' ==&lt;br /&gt;
*Expertiza repo : https://github.com/expertiza/expertiza&lt;/div&gt;</summary>
		<author><name>Proy4</name></author>
	</entry>
	<entry>
		<id>https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2017/E1792_OSS_Visualizations_for_instructors&amp;diff=113748</id>
		<title>CSC/ECE 517 Fall 2017/E1792 OSS Visualizations for instructors</title>
		<link rel="alternate" type="text/html" href="https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2017/E1792_OSS_Visualizations_for_instructors&amp;diff=113748"/>
		<updated>2017-11-27T22:40:50Z</updated>

		<summary type="html">&lt;p&gt;Proy4: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This wiki page is for the description of changes made under E1792 OSS Visualizations for instructors.&lt;br /&gt;
&lt;br /&gt;
__TOC__&lt;br /&gt;
&lt;br /&gt;
== About Expertiza ==&lt;br /&gt;
&lt;br /&gt;
[http://expertiza.ncsu.edu/ Expertiza] is an open source project based on [http://rubyonrails.org/ Ruby on Rails] framework. Expertiza allows the instructor to create new assignments and customize new or existing assignments. It also allows the instructor to create a list of topics the students can sign up for. Students can form teams in Expertiza to work on various projects and assignments. Students can also peer review other students' submissions. Instructors can also use Expertiza for interactive views of class performances and reviews.&lt;br /&gt;
&lt;br /&gt;
== Introduction ==&lt;br /&gt;
This project aims to improve the visualizations of certain pages related to reviews and feedback in Expertiza in the instructor's view. This would aid the instructors to judge outcomes of reviews and class performance in assignments via graphs and tables, which in turn would ease the process of grading the reviews.&lt;br /&gt;
&lt;br /&gt;
== Problem Statement ==&lt;br /&gt;
===&amp;lt;b&amp;gt;Issue 1&amp;lt;/b&amp;gt;===&lt;br /&gt;
: The scale is blue to green, which does not make sense, and the colors change randomly each time the page loads. It would be better to scale from red to green.&lt;br /&gt;
 	&lt;br /&gt;
===&amp;lt;b&amp;gt;Issue 2&amp;lt;/b&amp;gt;===&lt;br /&gt;
: Two adjacent bars represent the responses in rounds 1 through k. This makes sense only if the rubrics in all review rounds are the same. If the instructor uses the vary-rubric-by-round mechanism, this visualization will not make sense.&lt;br /&gt;
 	&lt;br /&gt;
===&amp;lt;b&amp;gt;Issue 3&amp;lt;/b&amp;gt;===&lt;br /&gt;
: The table on the View Scores page is presorted by team, but it should now also be sortable alphabetically. The cell view is far too long and should be divided into partials.&lt;br /&gt;
 	&lt;br /&gt;
===&amp;lt;b&amp;gt;Issue 4&amp;lt;/b&amp;gt;===&lt;br /&gt;
: An interactive visualization or table that shows how a class performed on selected rubric criteria would be immensely helpful. It would show the instructors what they need to focus more attention on.&lt;br /&gt;
 	&lt;br /&gt;
== '''Approach to be followed to fix the issues''' ==&lt;br /&gt;
'''Issue 1:''' Each cell contains a link, but it is unclear which page will open when the link is clicked.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 1:''' Each cell links to a different page in Expertiza, but there is no way to tell which cell is linked to which page, so users do not know the destination before they click. Since every cell contains a link, we will show hover-over text that displays the URL of the destination page.&lt;br /&gt;
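The idea can be sketched as follows (an illustrative sketch, not the actual Expertiza view code; the helper name and URL are hypothetical): render each cell with a `title` attribute equal to its `href`, so the hover-over text shows the destination.

```ruby
# Illustrative sketch: render a score cell whose hover-over text (the HTML
# title attribute) displays the same URL the link points to.
def score_cell(url, label)
  %(<a href="#{url}" title="#{url}">#{label}</a>)
end

cell = score_cell("/response/view?id=42", "95")
```

A test for this behavior only has to check that the `title` and `href` attributes of a cell match.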
&lt;br /&gt;
'''Issue 2:''' The scale is blue to green, which does not make sense, and the colors change randomly each time the page loads. It would be better to scale from red to green.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 2:''' The current color coding runs from blue to green and also changes randomly. We will use RGB color coding to scale from red to green, deciding the color by percentage: for example, a score between 0 and 20% is red, and so on. This way, whatever scale is used, the color coding will always be appropriate.&lt;br /&gt;
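A minimal sketch of the percentage-based red-to-green mapping (the helper name `score_color` is illustrative; the real implementation could bucket scores instead of interpolating continuously):

```ruby
# Map a score percentage (0-100) to a red-to-green hex color.
# Low scores are red, high scores are green; the blue channel stays at zero,
# so the scale never drifts toward blue.
def score_color(percent)
  percent = [[percent, 0].max, 100].min   # clamp to the 0-100 range
  red   = ((100 - percent) * 255 / 100.0).round
  green = (percent * 255 / 100.0).round
  format("#%02x%02x00", red, green)
end
```

Because the color is a pure function of the percentage, it no longer changes between page loads, and it adapts to any rubric scale once scores are normalized to percentages.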
&lt;br /&gt;
'''Issue 3:''' Two adjacent bars represent the responses in rounds 1 through k. This makes sense only if the rubrics in all review rounds are the same. If the instructor uses the vary-rubric-by-round mechanism, this visualization will not make sense.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 3:''' The review responses are displayed in a single grid covering every round from 1 to n. This works when the rubric is the same for all rounds, but when the rubric differs from round to round the view does not make sense. We therefore propose creating a separate grid for each review round, which handles both the varying-rubric and the same-rubric cases.&lt;br /&gt;
&lt;br /&gt;
'''Issue 4:''' The table is presorted by team, but it should now also be sortable alphabetically. The cell view is far too long and should be divided into partials.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 4:''' We can either sort the select query alphabetically on the appropriate column, or sort the table dynamically according to user-chosen criteria. The overly long cell view can be addressed with paging.&lt;br /&gt;
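A sketch of the alphabetical column sort in plain Ruby (standing in for an ActiveRecord `order` clause; the `Row` struct and its fields are hypothetical, not the actual grade-table schema):

```ruby
# Sort grade-table rows alphabetically on a chosen column,
# case-insensitively, as the clicked column header would request.
Row = Struct.new(:team_name, :score)

def sort_rows(rows, column)
  rows.sort_by { |row| row.public_send(column).to_s.downcase }
end

rows = [Row.new("wolfpack", 85), Row.new("Apex", 90), Row.new("mavericks", 78)]
sorted = sort_rows(rows, :team_name)
```

Downcasing before comparison keeps the order stable regardless of how team names are capitalized.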
&lt;br /&gt;
'''Issue 5:''' An interactive visualization or table that shows how a class performed on selected rubric criteria would be immensely helpful. It would show instructors what they need to focus more attention on. &lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 5:''' The issue concerns grading the performance of the entire class using the five rubric criteria. A proposed solution is to use the logic below, from the controller assessment360_controller.rb, which calculates the average grade for each student across all assignments in a particular course.&lt;br /&gt;
 &lt;br /&gt;
[[File:scr2.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
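The average-grade calculation can be sketched with dummy in-memory data (the hash layout and names below are illustrative; the real logic would query the models used by assessment360_controller.rb):

```ruby
# Dummy per-student grades keyed by assignment, standing in for
# the database queries made by the controller.
grades = {
  "student_a" => { "Assignment 1" => 90, "Assignment 2" => 80 },
  "student_b" => { "Assignment 1" => 70, "Assignment 2" => 100 },
}

# Average each student's grades across all of their assignments;
# these averages are what the bar graph would plot.
averages = grades.transform_values do |by_assignment|
  (by_assignment.values.sum.to_f / by_assignment.size).round(2)
end
```

Each value in `averages` corresponds to one bar in the proposed class-performance graph.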
The average grade of each student can be pulled and then used to plot a bar graph, as shown below (created using dummy data) :-&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Dummydata.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 1 :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Assignment1.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 2 :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Assignment2.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 3 :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Assignment3.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 4 :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Assignment4.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 5 :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Assignment5.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for average class performance :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Average.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
The workflow will look approximately like the diagram below :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Workflow.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''Issue 6:''' Integration of review performance - The aim of this implementation is to combine an author’s feedback on a review with the reviewer’s corresponding review, in order to build a method for grading reviewers.&lt;br /&gt;
Basic rubrics to be considered:&amp;lt;br&amp;gt;&lt;br /&gt;
-Number of reviews completed&amp;lt;br&amp;gt;&lt;br /&gt;
-Length of reviews&amp;lt;br&amp;gt;&lt;br /&gt;
-Summary of reviews&amp;lt;br&amp;gt;&lt;br /&gt;
-Whether reviewers added a file or link to their review &amp;lt;br&amp;gt;&lt;br /&gt;
-The average ratings they received from the author.&amp;lt;br&amp;gt;&lt;br /&gt;
-A graph or table is preferred for presenting these results.&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 6:''' The focus is on the extent to which a review helped the author. The author’s feedback is always available in the form of numerical scores. We intend to capture the author’s score for the review topic-wise, covering the tone of the review, the plausibility of the solutions the reviewer suggested, and how much the review helped the author improve (as rated by the author). &amp;lt;br&amp;gt;&lt;br /&gt;
-In each review round, the author’s feedback is recorded topic-wise as suggested above, and we can ''possibly generate a graph comparing the “n” rounds of reviews''.&amp;lt;br&amp;gt;&lt;br /&gt;
-If the reviewer added ''a legitimate file or link to their review'', we propose awarding a few extra credits that can be added to the reviewer's final grade.&lt;br /&gt;
The author’s feedback can be averaged into a single number; if that value exceeds a threshold, the reviews were genuinely meaningful.&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For example - &lt;br /&gt;
Consider author’s feedback as follows for 2 rounds of reviews:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:ExampleTableForAuthorFeedback.PNG]] &amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
		&lt;br /&gt;
Observations: The mean score of round 1 is 3 and the mean score of round 2 is 1.9. &amp;lt;br&amp;gt;&lt;br /&gt;
Suppose we decide that round 1 matters more than round 2, because it gives the author time to improve, and that every reviewer should provide at least 2 suggestions or comments that help authors improve. The threshold value is then 2, and the mean score of round 1 &amp;gt; threshold. &amp;lt;br&amp;gt;&lt;br /&gt;
Therefore, the reviews were meaningful and deserve credit in the higher range of grades. If the mean score of round 1 &amp;lt; threshold, then the mean score of round 2 is compared with the threshold.  &amp;lt;br&amp;gt;&lt;br /&gt;
		''Overall : Mean round 1&amp;gt; threshold : Higher range of grade to reviewer.&amp;lt;br&amp;gt;&lt;br /&gt;
			  Mean round 1&amp;lt;threshold  : Check if Mean round 2 &amp;gt; threshold &amp;lt;br&amp;gt;&lt;br /&gt;
							If true , Medium range of grade to reviewer.&amp;lt;br&amp;gt;&lt;br /&gt;
							Else , lower range of grades.&amp;lt;br&amp;gt;''&lt;br /&gt;
Once it is decided in what range a reviewer deserves credit, the other factors, such as the ''summary of the review'' and the ''length of the review'', are taken into account by the instructor and graded according to the effectiveness of the concepts explained. If a reviewer misses the second round of review, only the first round is taken into consideration. ''The instructor should assume that all but the last round of reviews are crucial to improving a document.''&amp;lt;br&amp;gt;&lt;br /&gt;
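The threshold rule above can be sketched in plain Ruby (an illustrative sketch only; the method names `mean` and `credit_range` and the symbolic ranges are not part of Expertiza):

```ruby
# Sketch of the proposed range-of-credits rule: compare the mean author
# feedback per round against the instructor-set threshold.
def mean(scores)
  scores.sum.to_f / scores.size
end

# Returns :higher, :medium, or :lower depending on which round (if any)
# clears the threshold. Round 2 may be missing if the reviewer skipped it.
def credit_range(round1_scores, round2_scores, threshold)
  if mean(round1_scores) > threshold
    :higher
  elsif round2_scores && !round2_scores.empty? && mean(round2_scores) > threshold
    :medium
  else
    :lower
  end
end
```

With the worked example (round 1 mean 3, round 2 mean 1.9, threshold 2), this returns the higher range, matching the observation above.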
'''Graphs:''' These can be generated to compare the effectiveness of the “n” rounds of reviews for the authors, based on the author’s scores. &amp;lt;br&amp;gt;&lt;br /&gt;
  &lt;br /&gt;
In this case - (only an example visualization)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:ExampleGraph.PNG]] &amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Observations: The graph and table together suggest that round 1 of the review actually had an impact on the author.&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
== '''Test Plan''' ==&lt;br /&gt;
'''Issue 1:''' Check that the link matches the hover-over text&lt;br /&gt;
*1) Login as instructor&lt;br /&gt;
*2) Go to View Scores page. Hover over a cell.&lt;br /&gt;
*3) You will see a link being displayed upon hovering. Click on the cell.&lt;br /&gt;
*4) If the system directs you to the link that was displayed on hover, the test passes.&lt;br /&gt;
&lt;br /&gt;
'''Issue 2:''' Check the color coding when different numbers are given on different scales.&lt;br /&gt;
*1) Login as instructor&lt;br /&gt;
*2) Create a dummy assignment and some dummy reviews for it. Log out.&lt;br /&gt;
*3) Login as a student. Attempt the assignment. Log out.&lt;br /&gt;
*4) Login as another student and repeat step 3.&lt;br /&gt;
*5) Login as either student and attempt the review. Logout.&lt;br /&gt;
*6) Login as the instructor. Go to the Review grades page and check the table. If the color coding ranges from red (for the lowest score) to green (for the highest score), the test passes.&lt;br /&gt;
&lt;br /&gt;
'''Issue 3:''' Enter multiple reviews with different rubrics and check that a separate table/grid is shown for each review round.&lt;br /&gt;
*1) Login as instructor.&lt;br /&gt;
*2) Create an assignment with two review rounds, and select a different rubric for each round.&lt;br /&gt;
*3) Login as a student (&amp;quot;A&amp;quot;) and submit the assignment. Repeat this for another student (&amp;quot;B&amp;quot;).&lt;br /&gt;
*4) Login as student A and perform a review. Do the same for student B.&lt;br /&gt;
*5) Now resubmit the assignment as students A and B.&lt;br /&gt;
*6) Resubmit the reviews as students A and B. This time the rubric will differ from the previous round.&lt;br /&gt;
*7) Now login as the instructor and view the review visualizations. A separate graph appears for each submission round.&lt;br /&gt;
&lt;br /&gt;
'''Issue 4:''' Check that the table can be sorted alphabetically by the appropriate column&lt;br /&gt;
*1) Login as instructor&lt;br /&gt;
*2) Create a dummy assignment with some teams. Logout.&lt;br /&gt;
*3) Login as a student and attempt the assignment and logout.&lt;br /&gt;
*4) Repeat step 3 for all dummy teams.&lt;br /&gt;
*5) Login as instructor.&lt;br /&gt;
*6) Go to View Scores page. Check the grade table. &lt;br /&gt;
*7) Click on a column header and check if data in it is getting sorted alphabetically. If yes, then the test passed.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''Issue 5:'''&lt;br /&gt;
*1) Login as the instructor&lt;br /&gt;
*2) Click on the button to compute graphs&lt;br /&gt;
*3) Compare the bar graphs with the separate scores of the students in each assignment.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''Issue 6:''' Since it is a new algorithm being proposed, we intend to test it first manually using dummy datasets. Later on, automated test cases can be added for the same.&lt;br /&gt;
*1) Login as an instructor and create a dummy assignment with 2 rounds of review submissions. Assign 2 teams of 1 student each to that assignment. Also, set the deadline of the first review to &amp;quot;1 hour&amp;quot;. Log out.&lt;br /&gt;
*2) Login as a student and submit some data for the assignment. Log out.&lt;br /&gt;
*3) Login as the second student and submit the same assignment using dummy data. Log out.&lt;br /&gt;
The steps above bring us to the review stage.&lt;br /&gt;
*4) Login as the first student, go to &amp;quot;Other's work&amp;quot;, and request a review document. By default, you will be assigned the second student's document to review. Once assigned, complete that review with dummy scores. Log out.&lt;br /&gt;
*5) Login as the second student. Check the review on the &amp;quot;Your scores&amp;quot; page and click on the &amp;quot;Show review&amp;quot; section. There you can provide scored feedback to the reviewer, taking into account the tone, the solutions suggested, and how helpful the review is. Log out.&lt;br /&gt;
*6) Login as the first student after the deadline for the first review passes. Go to &amp;quot;Other's work&amp;quot; and request a new submission. You will be assigned the same student's document to review for the second round. Follow step 4.&lt;br /&gt;
*7) Login as the second student and follow step 5.&lt;br /&gt;
*8) The system now has the author's feedback and the reviewer's data, and will calculate the reviewer's score according to the ''range of credits'' (lower, medium, and higher range) set by the instructor.&lt;br /&gt;
&lt;br /&gt;
*Edge cases - These may include scenarios like ''the reviewer adding a link in the review document'' and checking how the grade increases.&lt;br /&gt;
They may also cover the situation where the author's feedback isn't available; the algorithm would fail in that case, so the reviewer's score would depend solely on the reviews themselves and the instructor's judgment.&lt;br /&gt;
&lt;br /&gt;
== '''References''' ==&lt;br /&gt;
*Expertiza repo : https://github.com/expertiza/expertiza&lt;/div&gt;</summary>
		<author><name>Proy4</name></author>
	</entry>
	<entry>
		<id>https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2017/E1792_OSS_Visualizations_for_instructors&amp;diff=113747</id>
		<title>CSC/ECE 517 Fall 2017/E1792 OSS Visualizations for instructors</title>
		<link rel="alternate" type="text/html" href="https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2017/E1792_OSS_Visualizations_for_instructors&amp;diff=113747"/>
		<updated>2017-11-27T22:38:42Z</updated>

		<summary type="html">&lt;p&gt;Proy4: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This wiki page is for the description of changes made under E1792 OSS Visualizations for instructors.&lt;br /&gt;
&lt;br /&gt;
__TOC__&lt;br /&gt;
&lt;br /&gt;
== About Expertiza ==&lt;br /&gt;
&lt;br /&gt;
[http://expertiza.ncsu.edu/ Expertiza] is an open source project based on [http://rubyonrails.org/ Ruby on Rails] framework. Expertiza allows the instructor to create new assignments and customize new or existing assignments. It also allows the instructor to create a list of topics the students can sign up for. Students can form teams in Expertiza to work on various projects and assignments. Students can also peer review other students' submissions. Instructors can also use Expertiza for interactive views of class performances and reviews.&lt;br /&gt;
&lt;br /&gt;
== Introduction ==&lt;br /&gt;
This project aims to improve the visualizations of certain pages related to reviews and feedback in Expertiza in the instructor's view. This would aid the instructors to judge outcomes of reviews and class performance in assignments via graphs and tables, which in turn would ease the process of grading the reviews.&lt;br /&gt;
&lt;br /&gt;
== Problem Statement ==&lt;br /&gt;
===&amp;lt;b&amp;gt;Issue 1&amp;lt;/b&amp;gt;===: The scale runs from blue to green, which does not make sense, and the colors change randomly each time the page loads. It would be better to scale from red to green.&lt;br /&gt;
 	&lt;br /&gt;
===&amp;lt;b&amp;gt;Issue 2&amp;lt;/b&amp;gt;===: Two adjacent bars represent the responses in rounds 1 through k. This makes sense only if the rubrics in all review rounds are the same. If the instructor uses the vary-rubric-by-round mechanism, this visualization will not make sense.&lt;br /&gt;
 	&lt;br /&gt;
===&amp;lt;b&amp;gt;Issue 3&amp;lt;/b&amp;gt;===: The table on the View Scores page is presorted by team, but it should also be sortable alphabetically. The cell view looks far too long and should be divided into partials.&lt;br /&gt;
 	&lt;br /&gt;
===&amp;lt;b&amp;gt;Issue 4&amp;lt;/b&amp;gt;===: An interactive visualization or table that shows how a class performed on selected rubric criteria would be immensely helpful. It would show the instructors what they need to focus more attention on.&lt;br /&gt;
 	&lt;br /&gt;
== '''Approach to be followed to fix the issues''' ==&lt;br /&gt;
'''Issue 1:''' Each cell contains a link, but there is no way to tell which page will open when the link is clicked.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 1:''' Each cell links to a different page in Expertiza, but there is no way to know which cell is linked to which page, so the user does not know the destination before clicking. Since every cell contains a link, we will show hover-over text displaying the URL of the destination page.&lt;br /&gt;
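As a rough illustration of the hover-over idea (a minimal sketch; the helper name and URL are hypothetical, not Expertiza's actual code), the cell can be rendered as a link whose `title` attribute holds the destination URL, which browsers display as hover text:

```ruby
# Minimal sketch: render a score cell as a link whose "title" attribute
# shows the destination URL on hover. Helper name and URL are hypothetical.
def score_cell_html(score, destination_url)
  # Browsers display the title attribute as hover-over text.
  "<td><a href=\"#{destination_url}\" title=\"#{destination_url}\">#{score}</a></td>"
end

score_cell_html(87, "/response/view?id=42")
```

In a Rails view this would more idiomatically be `link_to score, path, title: path` inside the cell partial.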
&lt;br /&gt;
'''Issue 2:''' The scale runs from blue to green, which does not make sense, and the colors change randomly each time the page loads. It would be better to scale from red to green.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 2:''' Currently the color coding runs from blue to green and also changes randomly. We will use RGB color coding to make it run from red to green, using the score percentage to decide the color: for example, if a score falls between 0 and 20%, it is colored red, and so on. This way, whatever scale is used, the color coding remains appropriate.&lt;br /&gt;
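A minimal sketch of the percentage-based red-to-green mapping (the function name is ours, not Expertiza's; a real implementation could interpolate in finer steps or buckets):

```ruby
# Sketch: map a score percentage (0-100) to an RGB hex color fading from
# red (low) to green (high), independent of the underlying grading scale.
def score_color(percent)
  p = [[percent, 0].max, 100].min / 100.0  # clamp to the 0..1 range
  red   = ((1.0 - p) * 255).round
  green = (p * 255).round
  format("#%02x%02x00", red, green)        # blue channel stays 0
end

score_color(0)    # => "#ff0000" (red, lowest score)
score_color(100)  # => "#00ff00" (green, highest score)
```

Because the input is a percentage, the same mapping works no matter what raw scale the rubric uses.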
&lt;br /&gt;
'''Issue 3:''' Two adjacent bars represent the responses in rounds 1 through k. This makes sense only if the rubrics in all review rounds are the same. If the instructor uses the vary-rubric-by-round mechanism, this visualization will not make sense.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 3:''' The review responses are shown in a single grid covering all reviews from round 1 to round n. This is fine when the rubric is the same for every round, but when the rubric differs from round to round this view does not make sense. We therefore plan to create a separate grid for each review round, which works whether the rubrics differ or not.&lt;br /&gt;
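The per-round split can be sketched with a simple `group_by` (the `Response` struct below is dummy data, not Expertiza's actual model):

```ruby
# Sketch: group review responses by round so each round (with its own
# rubric) gets its own grid. The Response struct is illustrative only.
Response = Struct.new(:round, :reviewer, :scores)

def grids_by_round(responses)
  responses.group_by(&:round).sort.to_h
end

responses = [
  Response.new(1, "A", [4, 5]),
  Response.new(1, "B", [5, 4]),
  Response.new(2, "A", [3]),     # round 2 uses a different rubric
]
grids_by_round(responses).keys   # => [1, 2]
```

Each value in the resulting hash is the row set for one grid, so rubric columns never get mixed across rounds.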
&lt;br /&gt;
'''Issue 4:''' The table is presorted by team, but it should also be sortable alphabetically. The cell view looks far too long and should be divided into partials.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 4:''' We can sort the select query alphabetically on the appropriate column, or sort the table dynamically according to user-chosen criteria. The overly long cell view can be addressed with paging.&lt;br /&gt;
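The alphabetical sort can be sketched as follows (plain Ruby with hypothetical row hashes; in Rails it would normally live in the query itself, e.g. an `order` clause):

```ruby
# Sketch: sort table rows alphabetically (case-insensitively) by a column.
# In Rails this would usually be done in the query, e.g. Team.order(:name).
def sort_rows(rows, column)
  rows.sort_by { |row| row[column].to_s.downcase }
end

teams = [{ name: "zeta" }, { name: "Alpha" }, { name: "mu" }]
sort_rows(teams, :name).map { |t| t[:name] }  # => ["Alpha", "mu", "zeta"]
```

Downcasing before comparison avoids the common surprise that plain string sorts place all capitalized names before lowercase ones.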
&lt;br /&gt;
'''Issue 5:''' An interactive visualization or table that shows how a class performed on selected rubric criteria would be immensely helpful. It would show me what I need to focus more attention on. &lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 5:''' The issue is about grading the performance of the entire class using the 5 rubric criteria. A proposed solution is to use the logic below, from the controller assessment360_controller.rb, which calculates the average grade for each student across all assignments in a particular course.&lt;br /&gt;
 &lt;br /&gt;
[[File:scr2.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
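The averaging idea in the screenshot can be sketched roughly as follows (dummy data shapes, not Expertiza's actual models; the real logic lives in assessment360_controller.rb):

```ruby
# Rough sketch of the averaging idea: compute each student's average grade
# across all assignments in a course. Data shapes are illustrative only.
def average_grades(grades_by_student)
  grades_by_student.transform_values do |grades|
    (grades.sum.to_f / grades.size).round(2)
  end
end

average_grades("alice" => [80, 90, 100], "bob" => [70, 75])
# => {"alice"=>90.0, "bob"=>72.5}
```

The resulting per-student averages are exactly the values needed to plot the bar graphs shown below.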
The average grades of each student can be pulled and then used to plot a bar graph as shown below (created using dummy data) :-&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Dummydata.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 1 :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Assignment1.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 2 :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Assignment2.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 3 :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Assignment3.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 4 :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Assignment4.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 5 :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Assignment5.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for average class performance :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Average.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
The work flow can be approximately like the below  diagram :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Workflow.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''Issue 6:''' Integration of review performance - The basic aim of this implementation is to combine an author’s feedback on a review with the corresponding review by the reviewer, to build a method for grading the reviewers.&lt;br /&gt;
Basic rubrics to be considered:&amp;lt;br&amp;gt;&lt;br /&gt;
-Number of reviews completed&amp;lt;br&amp;gt;&lt;br /&gt;
-Length of reviews&amp;lt;br&amp;gt;&lt;br /&gt;
-Summary of reviews&amp;lt;br&amp;gt;&lt;br /&gt;
-Whether reviewers added a file or link to their review &amp;lt;br&amp;gt;&lt;br /&gt;
-The average ratings they received from the author.&amp;lt;br&amp;gt;&lt;br /&gt;
-A graph or table is preferred for easing this change.&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 6:''' The entire focus is on the extent to which a review helped the author. The author’s feedback is always available in the form of numerical scores. We intend to capture the author’s score of the review topic-wise, covering aspects such as the tone of the review, the plausible solutions to problems suggested by the reviewer, and how much the review helped the author improve (scored by the author). &amp;lt;br&amp;gt;&lt;br /&gt;
-In each review round, the author’s feedback is noted topic-wise as suggested in the previous step, and ''a graph comparing the “n” rounds of reviews can possibly be generated''.&amp;lt;br&amp;gt;&lt;br /&gt;
-If the reviewer added ''a legitimate file or link to their review'', we propose awarding the reviewer a few extra credits that can be added to their final grade.&lt;br /&gt;
The author’s feedback can be averaged into a single number; if that value exceeds a threshold, the reviews are considered genuinely meaningful.&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For example - &lt;br /&gt;
Consider author’s feedback as follows for 2 rounds of reviews:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:ExampleTableForAuthorFeedback.PNG]] &amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
		&lt;br /&gt;
Observations: Mean score of round 1 is 3 and Mean score of round 2 is 1.9. &amp;lt;br&amp;gt;&lt;br /&gt;
Suppose we assume that round 1 carries more weight than round 2, because it is what most helps the author improve, and also that every reviewer should provide at least 2 suggestions or comments that help the author improve. The threshold value is then taken to be 2, and the mean score of round 1 &amp;gt; threshold. &amp;lt;br&amp;gt;&lt;br /&gt;
Therefore, the reviews were meaningful and deserve credit in the higher range of grades. If the mean score of round 1 &amp;lt; threshold, then the mean score of round 2 is compared with the threshold.  &amp;lt;br&amp;gt;&lt;br /&gt;
		''Overall : Mean round 1&amp;gt; threshold : Higher range of grade to reviewer.&amp;lt;br&amp;gt;&lt;br /&gt;
			  Mean round 1&amp;lt;threshold  : Check if Mean round 2 &amp;gt; threshold &amp;lt;br&amp;gt;&lt;br /&gt;
							If true , Medium range of grade to reviewer.&amp;lt;br&amp;gt;&lt;br /&gt;
							Else , lower range of grades.&amp;lt;br&amp;gt;''&lt;br /&gt;
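The range rule above can be transcribed directly (the threshold of 2 is just the example value from the text; the actual threshold and credit ranges are set by the instructor):

```ruby
# Direct transcription of the range rule: a round-1 mean above the
# threshold earns the higher range; otherwise round 2 decides medium vs lower.
def reviewer_grade_range(mean_round1, mean_round2, threshold = 2)
  return :higher if mean_round1 > threshold
  mean_round2 > threshold ? :medium : :lower
end

reviewer_grade_range(3.0, 1.9)  # => :higher (the example from the table)
reviewer_grade_range(1.5, 2.4)  # => :medium
reviewer_grade_range(1.5, 1.0)  # => :lower
```

With the example means of 3 and 1.9, round 1 already clears the threshold, so round 2 is never consulted.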
Once the range in which a reviewer deserves credit is decided, the other factors, such as the ''summary of review'' and ''length of review'', are taken into account by the instructors and graded according to the effectiveness of the concepts explained. Also, if a reviewer misses the second round of review, only the first round is taken into consideration. ''The instructor should assume that all but the last round of reviews are crucial to improving a document.''&amp;lt;br&amp;gt;&lt;br /&gt;
'''Graphs:''' These can be generated to compare the effectiveness of the “n” rounds of reviews for the authors, based on the author’s scores. &amp;lt;br&amp;gt;&lt;br /&gt;
  &lt;br /&gt;
In this case - (only an example visualization)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:ExampleGraph.PNG]] &amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Observations: The graph and table together suggest that round 1 of the review actually had an impact on the author.&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
== '''Test Plan''' ==&lt;br /&gt;
'''Issue 1:''' Check the link matches to the hover over text&lt;br /&gt;
*1) Login as instructor&lt;br /&gt;
*2) Go to View Scores page. Hover over a cell.&lt;br /&gt;
*3) You will see a link being displayed upon hovering. Click on the cell.&lt;br /&gt;
*4) If the system directs you to the link that was displayed upon hovering, then the test passed.&lt;br /&gt;
&lt;br /&gt;
'''Issue 2:''' Check the color coding when different numbers are given on different scales.&lt;br /&gt;
*1) Login as instructor&lt;br /&gt;
*2) Create a dummy assignment and some dummy reviews for it. Log out.&lt;br /&gt;
*3) Login as a student. Attempt the assignment. Log out.&lt;br /&gt;
*4) Login as another student and repeat step 3.&lt;br /&gt;
*5) Login as either student and attempt the review. Logout.&lt;br /&gt;
*6) Login as instructor. Go to the Review Grades page and check the table. If the color coding ranges from red (lowest score) to green (highest score), the test passed.&lt;br /&gt;
&lt;br /&gt;
'''Issue 3:''' Enter multiple reviews with different rubrics and check that a separate table/grid is shown for each review round.&lt;br /&gt;
*1) Login as instructor.&lt;br /&gt;
*2) Create an assignment with 2 review rounds, selecting a different rubric for each round.&lt;br /&gt;
*3) Login as a student (&amp;quot;A&amp;quot;) and submit the assignment. Repeat this for another student(&amp;quot;B&amp;quot;).&lt;br /&gt;
*4) Login as student A and perform review. Do this for student B too.&lt;br /&gt;
*5) Now resubmit assignment as student A and B again.&lt;br /&gt;
*6) Resubmit reviews as student A and B again. This time the rubrics will be different from the previous round.&lt;br /&gt;
*7) Now login as instructor and see the visualization of the reviews. You can see the different graphs for different submissions.&lt;br /&gt;
&lt;br /&gt;
'''Issue 4:''' Check whether the table can be sorted alphabetically by the appropriate column.&lt;br /&gt;
*1) Login as instructor&lt;br /&gt;
*2) Create a dummy assignment with some teams. Logout.&lt;br /&gt;
*3) Login as a student and attempt the assignment and logout.&lt;br /&gt;
*4) Repeat step 3 for all dummy teams.&lt;br /&gt;
*5) Login as instructor.&lt;br /&gt;
*6) Go to View Scores page. Check the grade table. &lt;br /&gt;
*7) Click on a column header and check if data in it is getting sorted alphabetically. If yes, then the test passed.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''Issue 5:'''&lt;br /&gt;
*1) Login as the instructor&lt;br /&gt;
*2) Click on the button to compute graphs&lt;br /&gt;
*3) Compare the bar graphs with the separate scores of the students in each assignment.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''Issue 6:''' Since it is a new algorithm being proposed, we intend to test it first manually using dummy datasets. Later on, automated test cases can be added for the same.&lt;br /&gt;
*1) Login as an instructor and create a dummy assignment with 2 rounds of review submissions. Assign 2 teams of 1 student each to that assignment. Also, set the deadline of the first review to &amp;quot;1 hour&amp;quot;. Log out.&lt;br /&gt;
*2) Login as a student and submit some data for the assignment. Log out.&lt;br /&gt;
*3) Login as the second student and submit the same assignment using dummy data. Log out.&lt;br /&gt;
The steps above bring us to the review stage.&lt;br /&gt;
*4) Login as the first student, go to &amp;quot;Other's work&amp;quot;, and request a review document. By default, you will be assigned the second student's document to review. Once assigned, complete that review with dummy scores. Log out.&lt;br /&gt;
*5) Login as the second student. Check the review on the &amp;quot;Your scores&amp;quot; page and click on the &amp;quot;Show review&amp;quot; section. There you can provide scored feedback to the reviewer, taking into account the tone, the solutions suggested, and how helpful the review is. Log out.&lt;br /&gt;
*6) Login as the first student after the deadline for the first review passes. Go to &amp;quot;Other's work&amp;quot; and request a new submission. You will be assigned the same student's document to review for the second round. Follow step 4.&lt;br /&gt;
*7) Login as the second student and follow step 5.&lt;br /&gt;
*8) The system now has the author's feedback and the reviewer's data, and will calculate the reviewer's score according to the ''range of credits'' (lower, medium, and higher range) set by the instructor.&lt;br /&gt;
&lt;br /&gt;
*Edge cases - These may include scenarios like ''the reviewer adding a link in the review document'' and checking how the grade increases.&lt;br /&gt;
They may also cover the situation where the author's feedback isn't available; the algorithm would fail in that case, so the reviewer's score would depend solely on the reviews themselves and the instructor's judgment.&lt;br /&gt;
&lt;br /&gt;
== '''References''' ==&lt;br /&gt;
*Expertiza repo : https://github.com/expertiza/expertiza&lt;/div&gt;</summary>
		<author><name>Proy4</name></author>
	</entry>
	<entry>
		<id>https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2017/E1792_OSS_Visualizations_for_instructors&amp;diff=113746</id>
		<title>CSC/ECE 517 Fall 2017/E1792 OSS Visualizations for instructors</title>
		<link rel="alternate" type="text/html" href="https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2017/E1792_OSS_Visualizations_for_instructors&amp;diff=113746"/>
		<updated>2017-11-27T22:35:05Z</updated>

		<summary type="html">&lt;p&gt;Proy4: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This wiki page is for the description of changes made under E1792 OSS Visualizations for instructors.&lt;br /&gt;
&lt;br /&gt;
__TOC__&lt;br /&gt;
&lt;br /&gt;
== About Expertiza ==&lt;br /&gt;
&lt;br /&gt;
[http://expertiza.ncsu.edu/ Expertiza] is an open source project based on [http://rubyonrails.org/ Ruby on Rails] framework. Expertiza allows the instructor to create new assignments and customize new or existing assignments. It also allows the instructor to create a list of topics the students can sign up for. Students can form teams in Expertiza to work on various projects and assignments. Students can also peer review other students' submissions. Instructors can also use Expertiza for interactive views of class performances and reviews.&lt;br /&gt;
&lt;br /&gt;
== Introduction ==&lt;br /&gt;
This project aims to improve the visualizations of certain pages related to reviews and feedback in Expertiza in the instructor's view. This would aid the instructors to judge outcomes of reviews and class performance in assignments via graphs and tables, which in turn would ease the process of grading the reviews.&lt;br /&gt;
&lt;br /&gt;
== Problem Statement ==&lt;br /&gt;
====&amp;lt;b&amp;gt;Issue 1&amp;lt;/b&amp;gt;====: The scale runs from blue to green, which does not make sense, and the colors change randomly each time the page loads. It would be better to scale from red to green.&lt;br /&gt;
 	&lt;br /&gt;
====&amp;lt;b&amp;gt;Issue 2&amp;lt;/b&amp;gt;====: Two adjacent bars represent the responses in rounds 1 through k. This makes sense only if the rubrics in all review rounds are the same. If the instructor uses the vary-rubric-by-round mechanism, this visualization will not make sense.&lt;br /&gt;
 	&lt;br /&gt;
====&amp;lt;b&amp;gt;Issue 3&amp;lt;/b&amp;gt;====: The table on the View Scores page is presorted by team, but it should also be sortable alphabetically. The cell view looks far too long and should be divided into partials.&lt;br /&gt;
 	&lt;br /&gt;
====&amp;lt;b&amp;gt;Issue 4&amp;lt;/b&amp;gt;====: An interactive visualization or table that shows how a class performed on selected rubric criteria would be immensely helpful. It would show the instructors what they need to focus more attention on.&lt;br /&gt;
 	&lt;br /&gt;
== '''Approach to be followed to fix the issues''' ==&lt;br /&gt;
'''Issue 1:''' Each cell contains a link, but there is no way to tell which page will open when the link is clicked.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 1:''' Each cell links to a different page in Expertiza, but there is no way to know which cell is linked to which page, so the user does not know the destination before clicking. Since every cell contains a link, we will show hover-over text displaying the URL of the destination page.&lt;br /&gt;
&lt;br /&gt;
'''Issue 2:''' The scale runs from blue to green, which does not make sense, and the colors change randomly each time the page loads. It would be better to scale from red to green.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 2:''' Currently the color coding runs from blue to green and also changes randomly. We will use RGB color coding to make it run from red to green, using the score percentage to decide the color: for example, if a score falls between 0 and 20%, it is colored red, and so on. This way, whatever scale is used, the color coding remains appropriate.&lt;br /&gt;
&lt;br /&gt;
'''Issue 3:''' Two adjacent bars represent the responses in rounds 1 through k. This makes sense only if the rubrics in all review rounds are the same. If the instructor uses the vary-rubric-by-round mechanism, this visualization will not make sense.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 3:''' The review responses are shown in a single grid covering all reviews from round 1 to round n. This is fine when the rubric is the same for every round, but when the rubric differs from round to round this view does not make sense. We therefore plan to create a separate grid for each review round, which works whether the rubrics differ or not.&lt;br /&gt;
&lt;br /&gt;
'''Issue 4:''' The table is presorted by team, but it should also be sortable alphabetically. The cell view looks far too long and should be divided into partials.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 4:''' We can sort the select query alphabetically on the appropriate column, or sort the table dynamically according to user-chosen criteria. The overly long cell view can be addressed with paging.&lt;br /&gt;
&lt;br /&gt;
'''Issue 5:''' An interactive visualization or table that shows how a class performed on selected rubric criteria would be immensely helpful. It would show me what I need to focus more attention on. &lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 5:''' The issue is about grading the performance of the entire class using the 5 rubric criteria. A proposed solution is to use the logic below, from the controller assessment360_controller.rb, which calculates the average grade for each student across all assignments in a particular course.&lt;br /&gt;
 &lt;br /&gt;
[[File:scr2.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
The average grades of each student can be pulled and then used to plot a bar graph as shown below (created using dummy data) :-&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Dummydata.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 1 :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Assignment1.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 2 :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Assignment2.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 3 :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Assignment3.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 4 :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Assignment4.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 5 :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Assignment5.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for average class performance :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Average.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
The work flow can be approximately like the below  diagram :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Workflow.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''Issue 6:''' Integration of review performance - The basic aim of this implementation is to combine an author’s feedback on a review with the corresponding review by the reviewer, to build a method for grading the reviewers.&lt;br /&gt;
Basic rubrics to be considered:&amp;lt;br&amp;gt;&lt;br /&gt;
-Number of reviews completed&amp;lt;br&amp;gt;&lt;br /&gt;
-Length of reviews&amp;lt;br&amp;gt;&lt;br /&gt;
-Summary of reviews&amp;lt;br&amp;gt;&lt;br /&gt;
-Whether reviewers added a file or link to their review &amp;lt;br&amp;gt;&lt;br /&gt;
-The average ratings they received from the author.&amp;lt;br&amp;gt;&lt;br /&gt;
-A graph or table is preferred for easing this change.&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 6:''' The entire focus is on the extent to which a review helped the author. The author’s feedback is always available in the form of numerical scores. We intend to capture the author’s score of the review topic-wise, covering aspects such as the tone of the review, the plausible solutions to problems suggested by the reviewer, and how much the review helped the author improve (scored by the author). &amp;lt;br&amp;gt;&lt;br /&gt;
-In each review round, the author’s feedback is noted topic-wise as suggested in the previous step, and ''a graph comparing the “n” rounds of reviews can possibly be generated''.&amp;lt;br&amp;gt;&lt;br /&gt;
-If the reviewer added ''a legitimate file or link to their review'', we propose awarding the reviewer a few extra credits that can be added to their final grade.&lt;br /&gt;
The author’s feedback can be averaged into a single number; if that value exceeds a threshold, the reviews are considered genuinely meaningful.&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For example - &lt;br /&gt;
Consider author’s feedback as follows for 2 rounds of reviews:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:ExampleTableForAuthorFeedback.PNG]] &amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
		&lt;br /&gt;
Observations: Mean score of round 1 is 3 and Mean score of round 2 is 1.9. &amp;lt;br&amp;gt;&lt;br /&gt;
Suppose we assume that round 1 carries more weight than round 2, because it is what most helps the author improve, and also that every reviewer should provide at least 2 suggestions or comments that help the author improve. The threshold value is then taken to be 2, and the mean score of round 1 &amp;gt; threshold. &amp;lt;br&amp;gt;&lt;br /&gt;
Therefore, the reviews were meaningful and deserve credit in the higher range of grades. If the mean score of round 1 &amp;lt; threshold, then the mean score of round 2 is compared with the threshold.  &amp;lt;br&amp;gt;&lt;br /&gt;
		''Overall : Mean round 1&amp;gt; threshold : Higher range of grade to reviewer.&amp;lt;br&amp;gt;&lt;br /&gt;
			  Mean round 1&amp;lt;threshold  : Check if Mean round 2 &amp;gt; threshold &amp;lt;br&amp;gt;&lt;br /&gt;
							If true , Medium range of grade to reviewer.&amp;lt;br&amp;gt;&lt;br /&gt;
							Else , lower range of grades.&amp;lt;br&amp;gt;''&lt;br /&gt;
Once the range in which a reviewer deserves credit is decided, the other factors, such as the ''summary of review'' and ''length of review'', are taken into account by the instructors and graded according to the effectiveness of the concepts explained. Also, if a reviewer misses the second round of review, only the first round is taken into consideration. ''The instructor should assume that all but the last round of reviews are crucial to improving a document.''&amp;lt;br&amp;gt;&lt;br /&gt;
'''Graphs:''' These can be generated to compare the effectiveness of the “n” rounds of reviews for the authors, based on the author’s scores. &amp;lt;br&amp;gt;&lt;br /&gt;
  &lt;br /&gt;
In this case - (only an example visualization)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:ExampleGraph.PNG]] &amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Observations: The graph and table together suggest that round 1 of the review actually had an impact on the author.&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
== '''Test Plan''' ==&lt;br /&gt;
'''Issue 1:''' Check the link matches to the hover over text&lt;br /&gt;
*1) Login as instructor&lt;br /&gt;
*2) Go to View Scores page. Hover over a cell.&lt;br /&gt;
*3) You will see a link being displayed upon hovering. Click on the cell.&lt;br /&gt;
*4) If the system directs you to the link that was displayed upon hovering, then the test passed.&lt;br /&gt;
&lt;br /&gt;
'''Issue 2:''' Check the color coding when different numbers are given on different scales.&lt;br /&gt;
*1) Login as instructor&lt;br /&gt;
*2) Create a dummy assignment and some dummy reviews for it. Log out.&lt;br /&gt;
*3) Login as a student. Attempt the assignment. Log out.&lt;br /&gt;
*4) Login as another student and repeat step 3.&lt;br /&gt;
*5) Login as either student and attempt the review. Logout.&lt;br /&gt;
*6) Login as instructor. Go to the Review Grades page and check the table. If the color coding ranges from red (lowest score) to green (highest score), the test passed.&lt;br /&gt;
&lt;br /&gt;
'''Issue 3:''' Enter multiple reviews with different rubrics and check that a separate table/grid is shown for each review round.&lt;br /&gt;
*1) Login as instructor.&lt;br /&gt;
*2) Create an assignment. Select 2 review rounds, with a different rubric for each round.&lt;br /&gt;
*3) Login as a student (&amp;quot;A&amp;quot;) and submit the assignment. Repeat this for another student (&amp;quot;B&amp;quot;).&lt;br /&gt;
*4) Login as student A and perform a review. Do this for student B too.&lt;br /&gt;
*5) Now resubmit the assignment as students A and B again.&lt;br /&gt;
*6) Resubmit reviews as students A and B again. This time the rubric will be different from the previous round.&lt;br /&gt;
*7) Now login as instructor and see the visualization of the reviews. You can see different graphs for different submissions.&lt;br /&gt;
&lt;br /&gt;
'''Issue 4:''' Check whether the table is sorted alphabetically by the appropriate column&lt;br /&gt;
*1) Login as instructor&lt;br /&gt;
*2) Create a dummy assignment with some teams. Logout.&lt;br /&gt;
*3) Login as a student and attempt the assignment and logout.&lt;br /&gt;
*4) Repeat step 3 for all dummy teams.&lt;br /&gt;
*5) Login as instructor.&lt;br /&gt;
*6) Go to View Scores page. Check the grade table. &lt;br /&gt;
*7) Click on a column header and check whether the data in it is sorted alphabetically. If yes, the test passed.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''Issue 5:''' Check that the computed bar graphs match students' scores in each assignment.&lt;br /&gt;
*1) Login as the instructor&lt;br /&gt;
*2) Click on the button to compute graphs&lt;br /&gt;
*3) Compare the bar graphs with the separate scores of students in each assignment.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''Issue 6:''' Since a new algorithm is being proposed, we intend to test it manually first using dummy datasets. Automated test cases can be added later.&lt;br /&gt;
*1) Login as an instructor and create a dummy assignment with 2 rounds of review submissions. Assign two one-student teams to that assignment. Also, set the deadline of the first review to &amp;quot;1 hour&amp;quot;. Log out.&lt;br /&gt;
*2) Login as a student and submit some data for the assignment. Log out.&lt;br /&gt;
*3) Login as the second student and submit the same assignment using dummy data. Log out.&lt;br /&gt;
All the above steps bring us to the reviews stage.&lt;br /&gt;
*4) Login as the first student, go to &amp;quot;Other's work&amp;quot;, and request a review document. By default, you will be assigned the second student's document to review. Once assigned, complete that review with dummy scores. Log out.&lt;br /&gt;
*5) Login as the second student. Check the review on the &amp;quot;Your scores&amp;quot; page and click on the &amp;quot;Show review&amp;quot; section. There you can provide scored feedback to the reviewer. Take into account the tone, the solutions suggested, and how they will help you. Log out.&lt;br /&gt;
*6) Login as the first student after the deadline for the first review has passed. Go to &amp;quot;Other's work&amp;quot; and request a new submission. You will be assigned the same document of the other student to review for the second round. Follow step 4.&lt;br /&gt;
*7) Login as the second student and follow step 5.&lt;br /&gt;
*8) The system has the data related to the author's feedback and the reviewer's reviews, and will calculate the reviewer's score according to the ''range of credits'' (lower range, medium range, and higher range) set by the instructor.&lt;br /&gt;
&lt;br /&gt;
*Edge cases - These may include options like ''adding a link in the review document by the reviewer'' and seeing how the grade is increased.&lt;br /&gt;
They may also take into account the situation when the author's feedback isn't available; the reviewer's score would then depend solely on the reviews themselves and the instructor's judgment, because the algorithm would fail in this case.&lt;br /&gt;
&lt;br /&gt;
== '''References''' ==&lt;br /&gt;
*Expertiza repo : https://github.com/expertiza/expertiza&lt;/div&gt;</summary>
		<author><name>Proy4</name></author>
	</entry>
	<entry>
		<id>https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2017/E1780_OSS_Project_Teal_Email_Notification_Enhancements&amp;diff=113691</id>
		<title>CSC/ECE 517 Fall 2017/E1780 OSS Project Teal Email Notification Enhancements</title>
		<link rel="alternate" type="text/html" href="https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2017/E1780_OSS_Project_Teal_Email_Notification_Enhancements&amp;diff=113691"/>
		<updated>2017-11-20T05:11:57Z</updated>

		<summary type="html">&lt;p&gt;Proy4: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This wiki page is for the description of changes made under E1780 OSS Project Teal Email Notification Enhancements.&lt;br /&gt;
&lt;br /&gt;
__TOC__&lt;br /&gt;
&lt;br /&gt;
== About Expertiza ==&lt;br /&gt;
&lt;br /&gt;
[http://expertiza.ncsu.edu/ Expertiza] is an open source project based on [http://rubyonrails.org/ Ruby on Rails] framework. Expertiza allows the instructor to create new assignments and customize new or existing assignments. It also allows the instructor to create a list of topics the students can sign up for. Students can form teams in Expertiza to work on various projects and assignments. Students can also peer review other students' submissions. Expertiza also sends automated emails to the instructor, reviewers and participants for most of the above mentioned activities.&lt;br /&gt;
&lt;br /&gt;
== Problem Statement ==&lt;br /&gt;
The following tasks were accomplished in this project:&lt;br /&gt;
:* Instructor should get an option to create a participant if (s)he does not already exist in the system.&lt;br /&gt;
:* E-mails about reviews should direct the user to the page where the review is found.&lt;br /&gt;
:* Deadline reminders should include a link on where to go to perform the needed function.&lt;br /&gt;
:* Instructor notifications of where reviews disagree by more than a threshold # of points should point the instructor to the reviews that disagree.&lt;br /&gt;
:* Send out an email to the invitee when a participant sends out an invitation to another participant to join a team.&lt;br /&gt;
:* All activity on ad responses and invitations should be reported to the other party by e-mail.&lt;br /&gt;
:* Notify an instructor by e-mail when a student suggests a topic.&lt;br /&gt;
:* Instructor should get a copy of all emails sent to the student.&lt;br /&gt;
&lt;br /&gt;
== UML Diagram ==&lt;br /&gt;
[[File:Drawing3.JPG]]&lt;br /&gt;
&lt;br /&gt;
The UML diagram shows the functionality that is being developed in the project.&lt;br /&gt;
*''' Use Case1'''&lt;br /&gt;
*Use Case Description: Provide an option to create a participant when (s)he doesn't exist&lt;br /&gt;
*Actor: Instructor &lt;br /&gt;
*Precondition: Participant does not exist&lt;br /&gt;
*Post Condition: Gets the option to create a participant&lt;br /&gt;
&lt;br /&gt;
*''' Use Case2'''&lt;br /&gt;
*Use Case Description: Get an email once a review is done, with a link to the review page&lt;br /&gt;
*Actor: Student&lt;br /&gt;
*Precondition: Reviews a team&lt;br /&gt;
*Post Condition: The reviewed team gets an email with a review link. Clicking the link directs to the corresponding review page&lt;br /&gt;
&lt;br /&gt;
*''' Use Case3'''&lt;br /&gt;
*Use Case Description: Get a deadline reminder email when a review deadline is approaching&lt;br /&gt;
*Actor: Student&lt;br /&gt;
*Precondition: Reviews a team&lt;br /&gt;
*Post Condition: Gets a deadline reminder email to review a team. The email has a link to the review page&lt;br /&gt;
&lt;br /&gt;
*''' Use Case4'''&lt;br /&gt;
*Use Case Description: Gets email notifications for reviews that contradict by more than a particular threshold&lt;br /&gt;
*Actor: Instructor&lt;br /&gt;
*Precondition: Team reviews are done by at least 2 reviewers&lt;br /&gt;
*Post Condition: Instructor gets email notifications for reviews that contradict by more than a particular threshold, along with links to the contradicting reviews.&lt;br /&gt;
&lt;br /&gt;
*''' Use Case5'''&lt;br /&gt;
*Use Case Description: An email should be sent to the invitee with the team-join request when the request is sent.&lt;br /&gt;
*Actor: Student&lt;br /&gt;
*Precondition: Invites another participant to join a team&lt;br /&gt;
*Post Condition: Email sent to the invitee with the team-join request&lt;br /&gt;
&lt;br /&gt;
*''' Use Case6'''&lt;br /&gt;
*Use Case Description: All activity emails should be sent to the student&lt;br /&gt;
*Actor: Student&lt;br /&gt;
*Precondition: Activity is performed in Expertiza by team members.&lt;br /&gt;
*Post Condition: Student receives an email on all activity on ad responses and invitations performed by other team members.&lt;br /&gt;
&lt;br /&gt;
*''' Use Case7'''&lt;br /&gt;
*Use Case Description: Instructor receives an email for a topic suggestion by a student&lt;br /&gt;
*Actor: Student&lt;br /&gt;
*Precondition: Suggests a topic&lt;br /&gt;
*Post Condition: Instructor receives an email for the suggestion request&lt;br /&gt;
&lt;br /&gt;
*''' Use Case8'''&lt;br /&gt;
*Use Case Description: Instructor receives a copy of all emails sent to students about an assignment.&lt;br /&gt;
*Actor: Instructor&lt;br /&gt;
*Precondition: Activity done on assignment.&lt;br /&gt;
*Post Condition: Gets a copy of all emails sent to the student regarding activity on the assignment.&lt;br /&gt;
&lt;br /&gt;
== Peer Review Information ==&lt;br /&gt;
*The CodeClimate build fails for files that were already present, not for the changes made.&lt;br /&gt;
* For the changes to be tested use Instructor Login: username: instructor6 password: password&lt;br /&gt;
*A self-explanatory video has been uploaded for the same.&lt;br /&gt;
*Design principles are not needed as we mostly modified existing work.&lt;br /&gt;
*The Project Demo Video is present at : https://drive.google.com/open?id=0B4GJ154MimyPVGxqOWhZYk1mWFE&lt;br /&gt;
&lt;br /&gt;
== Files changed ==&lt;br /&gt;
====&amp;lt;b&amp;gt;Controllers&amp;lt;/b&amp;gt;====                           &lt;br /&gt;
::* invitations_controller.rb                    &lt;br /&gt;
::* profile_controller.rb                        &lt;br /&gt;
::* suggestion_controller.rb&lt;br /&gt;
::* users_controller.rb&lt;br /&gt;
====&amp;lt;b&amp;gt;Helpers&amp;lt;/b&amp;gt;====&lt;br /&gt;
::* mailer_helper.rb&lt;br /&gt;
::* login_helper.rb&lt;br /&gt;
====&amp;lt;b&amp;gt;Mailers&amp;lt;/b&amp;gt;====&lt;br /&gt;
::* delayed_mailer.rb&lt;br /&gt;
::* mailer.rb&lt;br /&gt;
====&amp;lt;b&amp;gt;Models&amp;lt;/b&amp;gt;====&lt;br /&gt;
::* response.rb&lt;br /&gt;
::* user.rb&lt;br /&gt;
====&amp;lt;b&amp;gt;Views&amp;lt;/b&amp;gt;====&lt;br /&gt;
::* invite_message.html.erb&lt;br /&gt;
::* new_topic_suggested_message.html.erb&lt;br /&gt;
::* notify_grade_conflict_message.html.erb&lt;br /&gt;
::* _invitation_accepted_html.html.erb&lt;br /&gt;
::* _invitation_declined_html.html.erb&lt;br /&gt;
::* _invitation_pending_html.html.erb&lt;br /&gt;
::* _new_submission_html.html.erb&lt;br /&gt;
::* _submission_deadline_test_html.html.erb&lt;br /&gt;
::* _additional_links.html.erb&lt;br /&gt;
::* add.js.erb&lt;br /&gt;
::* _prefs.html.erb&lt;br /&gt;
====&amp;lt;b&amp;gt;Spec&amp;lt;/b&amp;gt;====&lt;br /&gt;
::* factories.rb&lt;br /&gt;
::* user_spec.rb&lt;br /&gt;
&lt;br /&gt;
== Solutions Implemented ==&lt;br /&gt;
====&amp;lt;b&amp;gt;Providing option to instructor to create non-existent participant&amp;lt;/b&amp;gt;====&lt;br /&gt;
:A flash message was added when the instructor adds a non-existent user as a participant. A new link to redirect the instructor to the user creation page was also added.&lt;br /&gt;
:&amp;lt;i&amp;gt;Related code snippet&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:Additional_add.PNG]]&lt;br /&gt;
====&amp;lt;b&amp;gt;Link provided to redirect user to page where review is found&amp;lt;/b&amp;gt;====&lt;br /&gt;
:A  new link was incorporated in the email to redirect the user to the corresponding review page.&lt;br /&gt;
:&amp;lt;i&amp;gt;Related code snippet&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:new_submission.png]]&lt;br /&gt;
====&amp;lt;b&amp;gt;Incorporating a link for submission deadline reminder&amp;lt;/b&amp;gt;====&lt;br /&gt;
:Tests were run on the existing functionality, since deadline reminders are not currently being sent.&lt;br /&gt;
:&amp;lt;i&amp;gt;Related code snippet&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:submission_Deadline.png]]&lt;br /&gt;
====Enhancing the email that the instructor receives for contradicting reviews====&lt;br /&gt;
:The email that the instructor receives for contradicting reviews was enhanced by adding the previous average score of the total reviews and the score of the new review. The readability of the email was also increased by adding bullet points wherever necessary.&lt;br /&gt;
:&amp;lt;i&amp;gt;Related code snippet&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:conflict1.png]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;[[File:conflict2.png]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;[[File:conflict3.png]]&amp;lt;br&amp;gt;&lt;br /&gt;
====Sending email to the invitee to join a team====&lt;br /&gt;
:When a student invites one or more other students to join a team for a particular assignment, the invitee(s) should receive an email for the same.&lt;br /&gt;
:&amp;lt;i&amp;gt;Related code snippet&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:invitation_controller25.png]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;[[File:mailer_helper.png]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;[[File:invite_message.png]]&lt;br /&gt;
====All activities on ad responses and invitations should be reported to the other party by e-mail====&lt;br /&gt;
:Three new partials were created to send emails to both the inviter and the invitee for the responses (accept, decline, pending) to invitations to join a team.&lt;br /&gt;
:&amp;lt;i&amp;gt;Related code snippet&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:invitation_controllers26.png]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;[[File:inv_acc_dec.png]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
====Emailing the instructor when a student suggests a new topic====&lt;br /&gt;
:When a new topic is suggested by a student, an email is sent to the instructor so that (s)he can decide to approve or decline it.&lt;br /&gt;
:&amp;lt;i&amp;gt;Related code snippet&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:sugg_controller.png]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;[[File:topic_sugg.png]]&lt;br /&gt;
====Instructor should receive copy of emails being sent to the student if (s)he wishes to====&lt;br /&gt;
:The instructor can choose to receive all the emails being sent to students in order to verify the proper functioning of the system.&lt;br /&gt;
:&amp;lt;i&amp;gt;Related code snippet&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:lastreq1.png]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;[[File:lastreq2.png]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;[[File:lastreq3.png]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;[[File:lastreq4.png]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;[[File:lastreq5.png]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;[[File:lastreq6.png]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;[[File:lastreq7.png]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;[[File:lastreq8.png]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;[[File:lastreq9.png]]&lt;br /&gt;
&lt;br /&gt;
== Testing Details and TestPlan ==&lt;br /&gt;
===Testing with UI(Manual Testing)===&lt;br /&gt;
====Requirement1====&lt;br /&gt;
*If someone attempts to assign a nonexistent user as a participant in an assignment by filling out the form on the Add Participants page, (s)he should be warned that the user does not exist.  This is reasonable behaviour because the username may have been mistyped, and you wouldn't want to create a new user account due to a typo.&lt;br /&gt;
=====Testing Steps=====&lt;br /&gt;
Adding a non-existent user as a participant in an assignment&lt;br /&gt;
*Login as an instructor.&lt;br /&gt;
*Click on an existing active assignment.&lt;br /&gt;
*Click on Add Participants and add any non-existent user (any email id).&lt;br /&gt;
*Result: It shows a flash message saying the user does not exist and gives you a link to create a new user.&lt;br /&gt;
====Requirement2====&lt;br /&gt;
* E-mails about reviews should direct the user to the page where the review is found.&lt;br /&gt;
=====Testing Steps=====&lt;br /&gt;
*Login as an instructor.&lt;br /&gt;
*Click on an existing active assignment.&lt;br /&gt;
*Add an existing user as a reviewer and another user as a participant to this assignment.&lt;br /&gt;
*Login as the reviewer and submit your review. &lt;br /&gt;
*Result: The participant must have received an email on this review submission with a link to the review submitted.&lt;br /&gt;
====Requirement3====&lt;br /&gt;
* As there are currently no deadline reminders implemented in Expertiza, we were asked to verify the RSpec test cases already present in delayed_mailer_spec.rb and to fix any bugs found in them. No bugs were found.&lt;br /&gt;
=====Testing Steps=====&lt;br /&gt;
* Run the RSpec test cases with the following command:&lt;br /&gt;
bundle exec rspec delayed_mailer_spec.rb&lt;br /&gt;
*Result : See the attachment for the screenshot.&lt;br /&gt;
====Requirement4====&lt;br /&gt;
*Instructor notifications of where reviews disagree by more than a threshold # of points should point the instructor to the reviews that disagree.&lt;br /&gt;
*Put the links in a bulleted list that is easy to read.&lt;br /&gt;
*The mail should have the previous average score as well as the new score that is being assigned.&lt;br /&gt;
*Any other kind of improvement to increase the readability of the email is welcome&lt;br /&gt;
=====Testing Steps=====&lt;br /&gt;
*Login as an instructor and create three students. &lt;br /&gt;
*Hover over Manage Tab --&amp;gt; click on Assignments --&amp;gt; choose an existing active assignment--&amp;gt;Click on Add Participants section of that assignment.&lt;br /&gt;
*Out of the three users that you have created, add two as reviewers and one as participant to that assignment.&lt;br /&gt;
*Login as the participant and submit any link as your work.&lt;br /&gt;
*After the submission deadline is over, login as each of the reviewers and submit two contradicting reviews of the same assignment (say, one gives all 0s, the other all 5s).&lt;br /&gt;
*Result: This should send an email to the instructor describing the conflicting reviews.&lt;br /&gt;
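The disagreement check exercised by these steps can be sketched as follows. This is a hedged illustration only; the method name and the way the threshold is stored are assumptions, not the actual logic in response.rb.&lt;br /&gt;

```ruby
# Illustrative sketch (not Expertiza code) of the review-conflict check:
# a new review conflicts when its total score differs from the running
# average of the earlier reviews by more than the notification threshold.
def conflict?(previous_scores, new_score, threshold)
  return false if previous_scores.empty?  # nothing to disagree with yet
  avg = previous_scores.sum.to_f / previous_scores.size
  (new_score - avg).abs > threshold
end
```

With the all-0s/all-5s reviews above, the second review's total differs from the first's average by far more than any reasonable threshold, so the instructor email should fire.&lt;br /&gt;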
====Requirement5====&lt;br /&gt;
*Send out an email to the invitee when a participant sends out an invitation to another participant to join a team.&lt;br /&gt;
=====Testing Steps=====&lt;br /&gt;
*Login as an instructor --&amp;gt; click on the Assignments tab --&amp;gt; select an existing active assignment.&lt;br /&gt;
*Add one user as a participant, login as the participant, and invite a student to join the team for the assignment.&lt;br /&gt;
*Result: An email is sent to the invitee.&lt;br /&gt;
====Requirement6====&lt;br /&gt;
*The student who issued the invitation should also be e-mailed when the invitee joins the team. And also when a student responds to a teammate advertisement. In general, all activity on ad responses and invitations should be reported to the other party by e-mail (unless these e-mails are turned off in a (new) profile field).&lt;br /&gt;
=====Testing Steps=====&lt;br /&gt;
*Continue from Requirement 5. &lt;br /&gt;
*Result : Both the inviter and the invitee should get an email as per the requirement specified above.&lt;br /&gt;
====Requirement7====&lt;br /&gt;
*Notify an instructor by e-mail when a student suggests a topic.&lt;br /&gt;
=====Testing Steps=====&lt;br /&gt;
*Login as an instructor --&amp;gt; create a user with the Student role.&lt;br /&gt;
*Login as the student --&amp;gt; click on an existing active assignment.&lt;br /&gt;
*Click on the suggest-a-topic handle, suggest any topic, and save it.&lt;br /&gt;
*Result: The instructor is notified when the topic is suggested.&lt;br /&gt;
====Requirement8====&lt;br /&gt;
*Create an option (in the instructor’s profile) to get a copy of e-mails being sent to students (this is so the instructor can verify correct functioning of the system).&lt;br /&gt;
*Modified Requirement : Implemented for one such scenario, can easily be replicated for others.&lt;br /&gt;
&lt;br /&gt;
=====Testing Steps=====&lt;br /&gt;
*Login as an existing instructor --&amp;gt; go to profile tab.&lt;br /&gt;
*Do not tick the check box mentioning &amp;quot;get copy of all emails&amp;quot; yet. &lt;br /&gt;
*Go to Manage Users --&amp;gt; Create a student.&lt;br /&gt;
*Result 1 (negative testing) : Instructor does not receive an email.&lt;br /&gt;
&lt;br /&gt;
*1. Login as the instructor and go to the Profile tab.&lt;br /&gt;
*2. Tick the check box mentioning &amp;quot;get copy of all emails&amp;quot; and Save.&lt;br /&gt;
*3. Perform an event. Here, create any other user (maybe a Student or Reader) and Save. While creating the user, please ensure you give an email id that you can access.&lt;br /&gt;
&lt;br /&gt;
*Result 2 (positive testing): Both the created user and the instructor receive the email.&lt;br /&gt;
*Note: The instructor only receives a copy of all emails sent by the system if the check box is checked. In this case, we have implemented just one such scenario; this can easily be replicated for the others.&lt;br /&gt;
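The preference-gated copy behavior tested above can be sketched as below. The struct and method names are assumptions for illustration; the real logic lives across mailer.rb and the profile views.&lt;br /&gt;

```ruby
# Illustrative sketch (not Expertiza code): the cc list gains the
# instructor's address only when the profile check box is ticked.
Instructor = Struct.new(:email, :copy_of_emails)

def recipients(student_email, instructor)
  cc = instructor.copy_of_emails ? [instructor.email] : []
  { to: student_email, cc: cc }
end
```

Negative testing corresponds to `copy_of_emails` being false (empty cc list); positive testing corresponds to it being true (instructor copied).&lt;br /&gt;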
&lt;br /&gt;
===Automated Testing Scenario(RSpec)===&lt;br /&gt;
:The test cases for the complete functionality testing are covered.&lt;br /&gt;
:There were existing test cases for delayed_mailer.rb, and the functionality was checked by running them. A snapshot of running RSpec for the deadline reminder is given below.&lt;br /&gt;
=====Related snippet&amp;lt;br&amp;gt;[[File:rspec421.png]]=====&lt;br /&gt;
&lt;br /&gt;
== Future Work ==&lt;br /&gt;
#The reviewer receives an email after a submission is revised.&lt;br /&gt;
##The above-mentioned email should also state the review round after which the submission was revised.&lt;br /&gt;
##The above-mentioned email must contain a link that redirects the reviewer to the page where the review is found.&lt;br /&gt;
#The reviewer does not receive the above-mentioned email if the last round of review has been completed.&lt;br /&gt;
#Deadline reminder emails should be implemented in the system for both submissions and reviews.&lt;br /&gt;
##A link should be provided in that email so that the user knows where to go to perform the needed action.&lt;br /&gt;
#'''Alternative Approach that can be implemented'''&lt;br /&gt;
##All the changes made in view files could be moved to helpers, following OOD coding standards.&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;br /&gt;
*The Project Demo Video is present at : https://drive.google.com/open?id=0B4GJ154MimyPVGxqOWhZYk1mWFE&lt;br /&gt;
&lt;br /&gt;
*[https://github.com/expertiza/expertiza Expertiza on GitHub]&lt;br /&gt;
*[http://expertiza.ncsu.edu/ The live Expertiza website]&lt;br /&gt;
*[https://relishapp.com/rspec Rspec Documentation]&lt;/div&gt;</summary>
		<author><name>Proy4</name></author>
	</entry>
	<entry>
		<id>https://wiki.expertiza.ncsu.edu/index.php?title=File:Drawing3.JPG&amp;diff=113690</id>
		<title>File:Drawing3.JPG</title>
		<link rel="alternate" type="text/html" href="https://wiki.expertiza.ncsu.edu/index.php?title=File:Drawing3.JPG&amp;diff=113690"/>
		<updated>2017-11-20T05:09:34Z</updated>

		<summary type="html">&lt;p&gt;Proy4: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Proy4</name></author>
	</entry>
	<entry>
		<id>https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2017/E1780_OSS_Project_Teal_Email_Notification_Enhancements&amp;diff=113689</id>
		<title>CSC/ECE 517 Fall 2017/E1780 OSS Project Teal Email Notification Enhancements</title>
		<link rel="alternate" type="text/html" href="https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2017/E1780_OSS_Project_Teal_Email_Notification_Enhancements&amp;diff=113689"/>
		<updated>2017-11-20T05:09:13Z</updated>

		<summary type="html">&lt;p&gt;Proy4: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This wiki page is for the description of changes made under E1780 OSS Project Teal Email Notification Enhancements.&lt;br /&gt;
&lt;br /&gt;
__TOC__&lt;br /&gt;
&lt;br /&gt;
== About Expertiza ==&lt;br /&gt;
&lt;br /&gt;
[http://expertiza.ncsu.edu/ Expertiza] is an open source project based on [http://rubyonrails.org/ Ruby on Rails] framework. Expertiza allows the instructor to create new assignments and customize new or existing assignments. It also allows the instructor to create a list of topics the students can sign up for. Students can form teams in Expertiza to work on various projects and assignments. Students can also peer review other students' submissions. Expertiza also sends automated emails to the instructor, reviewers and participants for most of the above mentioned activities.&lt;br /&gt;
&lt;br /&gt;
== Problem Statement ==&lt;br /&gt;
The following tasks were accomplished in this project:&lt;br /&gt;
:* Instructor should get an option to create a participant if (s)he does not already exist in the system.&lt;br /&gt;
:* E-mails about reviews should direct the user to the page where the review is found.&lt;br /&gt;
:* Deadline reminders should include a link on where to go to perform the needed function.&lt;br /&gt;
:* Instructor notifications of where reviews disagree by more than a threshold # of points should point the instructor to the reviews that disagree.&lt;br /&gt;
:* Send out an email to the invitee when a participant sends out an invitation to another participant to join a team.&lt;br /&gt;
:* All activity on ad responses and invitations should be reported to the other party by e-mail.&lt;br /&gt;
:* Notify an instructor by e-mail when a student suggests a topic.&lt;br /&gt;
:* Instructor should get a copy of all emails sent to the student.&lt;br /&gt;
&lt;br /&gt;
== UML Diagram ==&lt;br /&gt;
[[File:Drawing3.JPG]]&lt;br /&gt;
&lt;br /&gt;
The UML diagram shows the functionality that is being developed in the project.&lt;br /&gt;
*''' Use Case1'''&lt;br /&gt;
*Use Case Description: Provide an option to create a participant when (s)he doesn't exist&lt;br /&gt;
*Actor: Instructor &lt;br /&gt;
*Precondition: Participant does not exist&lt;br /&gt;
*Post Condition: Gets the option to create a participant&lt;br /&gt;
&lt;br /&gt;
*''' Use Case2'''&lt;br /&gt;
*Use Case Description: Get an email once a review is done, with a link to the review page&lt;br /&gt;
*Actor: Student&lt;br /&gt;
*Precondition: Reviews a team&lt;br /&gt;
*Post Condition: The reviewed team gets an email with a review link. Clicking the link should direct to the corresponding review page&lt;br /&gt;
&lt;br /&gt;
*''' Use Case3'''&lt;br /&gt;
*Use Case Description: Get a deadline reminder email when a review deadline is approaching&lt;br /&gt;
*Actor: Student&lt;br /&gt;
*Precondition: Reviews a team&lt;br /&gt;
*Post Condition: Gets a deadline reminder email to review a team. The email should have a link to the review page&lt;br /&gt;
&lt;br /&gt;
*''' Use Case4'''&lt;br /&gt;
*Use Case Description: Gets email notifications for reviews that contradict by more than a particular threshold&lt;br /&gt;
*Actor: Instructor&lt;br /&gt;
*Precondition: Team reviews are done by at least 2 reviewers&lt;br /&gt;
*Post Condition: Instructor gets email notifications for reviews that contradict by more than a particular threshold, along with links to the contradicting reviews.&lt;br /&gt;
&lt;br /&gt;
*''' Use Case5'''&lt;br /&gt;
*Use Case Description: An email should be sent to the invitee with the team-join request when the request is sent.&lt;br /&gt;
*Actor: Student&lt;br /&gt;
*Precondition: Invites another participant to join a team&lt;br /&gt;
*Post Condition: Email should be sent to the invitee with the team-join request&lt;br /&gt;
&lt;br /&gt;
*''' Use Case6'''&lt;br /&gt;
*Use Case Description: All activity emails should be sent to the student&lt;br /&gt;
*Actor: Student&lt;br /&gt;
*Precondition: Activity is performed in Expertiza by team members.&lt;br /&gt;
*Post Condition: Student should receive an email on all activity on ad responses and invitations performed by other team members.&lt;br /&gt;
&lt;br /&gt;
*''' Use Case7'''&lt;br /&gt;
*Use Case Description: Instructor receives an email for a topic suggestion by a student&lt;br /&gt;
*Actor: Student&lt;br /&gt;
*Precondition: Suggests a topic&lt;br /&gt;
*Post Condition: Instructor receives an email for the suggestion request&lt;br /&gt;
&lt;br /&gt;
*''' Use Case8'''&lt;br /&gt;
*Use Case Description: Instructor receives a copy of all emails sent to students about an assignment.&lt;br /&gt;
*Actor: Instructor&lt;br /&gt;
*Precondition: Activity done on assignment.&lt;br /&gt;
*Post Condition: Should get a copy of all emails sent to the student regarding activity on the assignment.&lt;br /&gt;
&lt;br /&gt;
== Peer Review Information ==&lt;br /&gt;
*The CodeClimate build fails for files that were already present, not for the changes made.&lt;br /&gt;
* For the changes to be tested use Instructor Login: username: instructor6 password: password&lt;br /&gt;
*A self-explanatory video has been uploaded for the same.&lt;br /&gt;
*Design principles are not needed as we mostly modified existing work.&lt;br /&gt;
*The Project Demo Video is present at : https://drive.google.com/open?id=0B4GJ154MimyPVGxqOWhZYk1mWFE&lt;br /&gt;
&lt;br /&gt;
== Files changed ==&lt;br /&gt;
====&amp;lt;b&amp;gt;Controllers&amp;lt;/b&amp;gt;====                           &lt;br /&gt;
::* invitations_controller.rb                    &lt;br /&gt;
::* profile_controller.rb                        &lt;br /&gt;
::* suggestion_controller.rb&lt;br /&gt;
::* users_controller.rb&lt;br /&gt;
====&amp;lt;b&amp;gt;Helpers&amp;lt;/b&amp;gt;====&lt;br /&gt;
::* mailer_helper.rb&lt;br /&gt;
::* login_helper.rb&lt;br /&gt;
====&amp;lt;b&amp;gt;Mailers&amp;lt;/b&amp;gt;====&lt;br /&gt;
::* delayed_mailer.rb&lt;br /&gt;
::* mailer.rb&lt;br /&gt;
====&amp;lt;b&amp;gt;Models&amp;lt;/b&amp;gt;====&lt;br /&gt;
::* response.rb&lt;br /&gt;
::* user.rb&lt;br /&gt;
====&amp;lt;b&amp;gt;Views&amp;lt;/b&amp;gt;====&lt;br /&gt;
::* invite_message.html.erb&lt;br /&gt;
::* new_topic_suggested_message.html.erb&lt;br /&gt;
::* notify_grade_conflict_message.html.erb&lt;br /&gt;
::* _invitation_accepted_html.html.erb&lt;br /&gt;
::* _invitation_declined_html.html.erb&lt;br /&gt;
::* _invitation_pending_html.html.erb&lt;br /&gt;
::* _new_submission_html.html.erb&lt;br /&gt;
::* _submission_deadline_test_html.html.erb&lt;br /&gt;
::* _additional_links.html.erb&lt;br /&gt;
::* add.js.erb&lt;br /&gt;
::* _prefs.html.erb&lt;br /&gt;
====&amp;lt;b&amp;gt;Spec&amp;lt;/b&amp;gt;====&lt;br /&gt;
::* factories.rb&lt;br /&gt;
::* user_spec.rb&lt;br /&gt;
&lt;br /&gt;
== Solutions Implemented ==&lt;br /&gt;
====&amp;lt;b&amp;gt;Providing option to instructor to create non-existent participant&amp;lt;/b&amp;gt;====&lt;br /&gt;
:A flash message was added when the instructor adds a non-existent user as a participant. A new link to redirect the instructor to the user creation page was also added.&lt;br /&gt;
:&amp;lt;i&amp;gt;Related code snippet&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:Additional_add.PNG]]&lt;br /&gt;
====&amp;lt;b&amp;gt;Link provided to redirect user to page where review is found&amp;lt;/b&amp;gt;====&lt;br /&gt;
:A  new link was incorporated in the email to redirect the user to the corresponding review page.&lt;br /&gt;
:&amp;lt;i&amp;gt;Related code snippet&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:new_submission.png]]&lt;br /&gt;
====&amp;lt;b&amp;gt;Incorporating a link for submission deadline reminder&amp;lt;/b&amp;gt;====&lt;br /&gt;
:Tests were run against the already-existing functionality, since deadline reminders are not currently sent by the system.&lt;br /&gt;
:&amp;lt;i&amp;gt;Related code snippet&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:submission_Deadline.png]]&lt;br /&gt;
====Enhancing the email that the instructor receives for contradicting reviews====&lt;br /&gt;
:The email that the instructor receives for contradicting reviews was enhanced by adding the previous average score of the total reviews and the score of the new review. The readability of the email was also increased by adding bullet points wherever necessary.&lt;br /&gt;
:&amp;lt;i&amp;gt;Related code snippet&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:conflict1.png]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;[[File:conflict2.png]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;[[File:conflict3.png]]&amp;lt;br&amp;gt;&lt;br /&gt;
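The underlying comparison can be sketched in plain Ruby. This is a hypothetical, self-contained stand-in for the check in response.rb; the method and parameter names are assumptions.

```ruby
# Sketch of the review-conflict check: compare a new review score
# against the average of the previous reviews and flag a conflict
# when the difference exceeds the instructor's threshold. The email
# described above reports both the previous average and the new score.
def conflict?(previous_scores, new_score, threshold)
  return false if previous_scores.empty?
  avg = previous_scores.sum.to_f / previous_scores.size
  (new_score - avg).abs > threshold
end
```

For example, if earlier reviewers gave all 0s and a new reviewer gives all 5s, the new score deviates from the previous average by far more than a typical threshold, so the instructor is notified.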
====Sending email to the invitee to join a team====&lt;br /&gt;
:When a student invites one or more other students to join a team for a particular assignment, each invitee receives an email notification.&lt;br /&gt;
:&amp;lt;i&amp;gt;Related code snippet&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:invitation_controller25.png]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;[[File:mailer_helper.png]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;[[File:invite_message.png]]&lt;br /&gt;
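The fields of the invitation email can be sketched as follows. This is only a model of the composition step; the real code uses ActionMailer with the invite_message.html.erb template, and the method name here is hypothetical.

```ruby
# Sketch of composing the invitation email fields, mirroring the
# invitations_controller / mailer_helper flow described above.
def invite_email(inviter, invitee_email, assignment)
  {
    to: invitee_email,
    subject: "Invitation to join a team for #{assignment}",
    body: "#{inviter} has invited you to join their team for #{assignment}."
  }
end
```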
====All activities on ad responses and invitations should be reported to the other party by e-mail====&lt;br /&gt;
:Three new partials were created to send emails to both the inviter and the invitee for the responses (accept, decline, pending) to invitations to join a team.&lt;br /&gt;
:&amp;lt;i&amp;gt;Related code snippet&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:invitation_controllers26.png]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;[[File:inv_acc_dec.png]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
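The choice among the three partials can be sketched as a simple mapping. Only the partial file names come from the list above; the single-letter status codes and the method name are assumptions for illustration.

```ruby
# Sketch of selecting the email partial for an invitation reply,
# covering the three cases: accepted, declined, and still pending.
def invitation_partial(status)
  case status
  when "A" then "_invitation_accepted_html"
  when "D" then "_invitation_declined_html"
  else          "_invitation_pending_html"
  end
end
```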
&lt;br /&gt;
====Emailing the instructor when a student suggests a new topic====&lt;br /&gt;
:When a student suggests a new topic, an email is sent to the instructor so that (s)he can decide whether to approve or decline it.&lt;br /&gt;
:&amp;lt;i&amp;gt;Related code snippet&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:sugg_controller.png]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;[[File:topic_sugg.png]]&lt;br /&gt;
====Instructor can receive a copy of emails sent to students if (s)he wishes====&lt;br /&gt;
:The instructor can choose to receive a copy of all emails sent to students in order to verify that the system is functioning correctly.&lt;br /&gt;
:&amp;lt;i&amp;gt;Related code snippet&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:lastreq1.png]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;[[File:lastreq2.png]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;[[File:lastreq3.png]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;[[File:lastreq4.png]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;[[File:lastreq5.png]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;[[File:lastreq6.png]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;[[File:lastreq7.png]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;[[File:lastreq8.png]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;[[File:lastreq9.png]]&lt;br /&gt;
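The recipient logic for this preference can be sketched in plain Ruby. This is a hypothetical stand-in for the behaviour added around mailer.rb and the _prefs.html.erb check box; the names are assumptions.

```ruby
# Sketch of the instructor-copy preference: when the profile flag
# (the "get copy of all emails" check box) is set, the instructor
# is added to the recipient list alongside the student.
def recipients(student_email, instructor_email, copy_of_emails)
  list = [student_email]
  list.push(instructor_email) if copy_of_emails
  list
end
```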
&lt;br /&gt;
== Testing Details and TestPlan ==&lt;br /&gt;
===Testing with UI(Manual Testing)===&lt;br /&gt;
====Requirement1====&lt;br /&gt;
*If someone attempts to assign a nonexistent user as a participant in an assignment by filling out the form on the Add Participants page, (s)he should be warned that the user does not exist.  This is reasonable behaviour because the username may have been mistyped, and you wouldn't want to create a new user account due to a typo.&lt;br /&gt;
=====Testing Steps=====&lt;br /&gt;
Adding a non-existent user as a participant in an assignment&lt;br /&gt;
*Login as an instructor.&lt;br /&gt;
*Click on an existing active assignment.&lt;br /&gt;
*Click on Add Participants and add any non-existent user (any email id).&lt;br /&gt;
*Result: A flash message states that the user does not exist and provides a link to create a new user.&lt;br /&gt;
====Requirement2====&lt;br /&gt;
* E-mails about reviews should direct the user to the page where the review is found.&lt;br /&gt;
=====Testing Steps=====&lt;br /&gt;
*Login as an instructor.&lt;br /&gt;
*Click on an existing active assignment.&lt;br /&gt;
*Add an existing user as a reviewer and another user as a participant to this assignment.&lt;br /&gt;
*Login as the reviewer and submit your review. &lt;br /&gt;
*Result: The participant must have received an email on this review submission with a link to the review submitted.&lt;br /&gt;
====Requirement3====&lt;br /&gt;
*As deadline reminders are not currently implemented in Expertiza, we were asked to verify the RSpec test cases already present in delayed_mailer_spec.rb and to fix any bugs found in them. No bugs were found.&lt;br /&gt;
=====Testing Steps=====&lt;br /&gt;
* Run the RSpec test cases with the following command:&lt;br /&gt;
bundle exec rspec delayed_mailer_spec.rb&lt;br /&gt;
*Result: See the attached screenshot.&lt;br /&gt;
====Requirement4====&lt;br /&gt;
*Instructor notifications of where reviews disagree by more than a threshold # of points should point the instructor to the reviews that disagree.&lt;br /&gt;
*Put the links in a bulleted list that is easy to read.&lt;br /&gt;
*The mail should have the previous average score as well as the new score that is being assigned.&lt;br /&gt;
*Any other kind of improvement to increase the readability of the email is welcome.&lt;br /&gt;
=====Testing Steps=====&lt;br /&gt;
*Login as an instructor and create three students. &lt;br /&gt;
*Hover over Manage Tab --&amp;gt; click on Assignments --&amp;gt; choose an existing active assignment--&amp;gt;Click on Add Participants section of that assignment.&lt;br /&gt;
*Out of the three users that you have created, add two as reviewers and one as participant to that assignment.&lt;br /&gt;
*Login as the participant and submit any link as your work.&lt;br /&gt;
*After the submission deadline is over, login as each of the reviewers and submit two contradicting reviews of the same assignment (say, one gives all 0s and the other gives all 5s).&lt;br /&gt;
*Result: An email is sent to the instructor with the text shown below.&lt;br /&gt;
====Requirement5====&lt;br /&gt;
*Send out an email to the invitee when a participant sends out an invitation to another participant to join a team.&lt;br /&gt;
=====Testing Steps=====&lt;br /&gt;
*Login as an instructor --&amp;gt; click on the Assignments tab --&amp;gt; select an existing active assignment.&lt;br /&gt;
*Add one user as a participant, login as that participant, and invite a student to join the team for the assignment.&lt;br /&gt;
*Result: An email is sent to the invitee.&lt;br /&gt;
====Requirement6====&lt;br /&gt;
*The student who issued the invitation should also be e-mailed when the invitee joins the team. And also when a student responds to a teammate advertisement. In general, all activity on ad responses and invitations should be reported to the other party by e-mail (unless these e-mails are turned off in a (new) profile field).&lt;br /&gt;
=====Testing Steps=====&lt;br /&gt;
*Continue from Requirement 5. &lt;br /&gt;
*Result: Both the inviter and the invitee receive an email, as per the requirement specified above.&lt;br /&gt;
====Requirement7====&lt;br /&gt;
*Notify an instructor by e-mail when a student suggests a topic.&lt;br /&gt;
=====Testing Steps=====&lt;br /&gt;
*Login as an instructor--&amp;gt; Create a user with the Student role.&lt;br /&gt;
*Login as a student--&amp;gt; Click on an existing active assignment.&lt;br /&gt;
*Click on the suggest-a-topic handle, suggest any topic, and save it.&lt;br /&gt;
*Result: The instructor is notified when the topic is suggested.&lt;br /&gt;
====Requirement8====&lt;br /&gt;
*Create an option (in the instructor’s profile) to get a copy of e-mails being sent to students (this is so the instructor can verify correct functioning of the system).&lt;br /&gt;
*Modified Requirement: Implemented for one such scenario; it can easily be replicated for others.&lt;br /&gt;
&lt;br /&gt;
=====Testing Steps=====&lt;br /&gt;
*Login as an existing instructor --&amp;gt; go to the Profile tab.&lt;br /&gt;
*Leave the check box labelled &amp;quot;get copy of all emails&amp;quot; unticked for now.&lt;br /&gt;
*Go to Manage Users --&amp;gt; Create a student.&lt;br /&gt;
*Result 1 (negative testing): The instructor does not receive an email.&lt;br /&gt;
&lt;br /&gt;
*Login as the instructor and go to the Profile tab.&lt;br /&gt;
*Tick the check box labelled &amp;quot;get copy of all emails&amp;quot; and Save.&lt;br /&gt;
*Perform an event: create any other user (e.g. a Student or a Reader) and Save. While creating the user, please ensure you give an email id that you can access.&lt;br /&gt;
&lt;br /&gt;
*Result 2 (positive testing): Both the created user and the instructor receive the email.&lt;br /&gt;
*Note: The instructor receives a copy of all emails sent by the system only if the check box is checked. We have implemented this for just one scenario; it can easily be replicated for the others.&lt;br /&gt;
&lt;br /&gt;
===Automated Testing Scenario(RSpec)===&lt;br /&gt;
:Test cases covering the complete functionality are included.&lt;br /&gt;
:Existing test cases for delayed_mailer.rb were run to check that functionality. A snapshot of running RSpec for the deadline reminders is given below.&lt;br /&gt;
=====Related snippet&amp;lt;br&amp;gt;[[File:rspec421.png]]=====&lt;br /&gt;
&lt;br /&gt;
== Future Work ==&lt;br /&gt;
#A reviewer receives an email after a submission is revised.&lt;br /&gt;
##This email should also state the review round after which the submission was revised.&lt;br /&gt;
##This email should contain a link that redirects the reviewer to the page where the review is found.&lt;br /&gt;
#The reviewer does not receive this email if the last round of review has been completed.&lt;br /&gt;
#Deadline reminder emails should be implemented in the system for both submissions and reviews.&lt;br /&gt;
##A link should be provided in that email so that the user knows where to go to perform the needed action.&lt;br /&gt;
#'''Alternative Approach that can be implemented'''&lt;br /&gt;
##All the changes done in view files could be changed to helpers following the OOD coding standards.&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;br /&gt;
*The project demo video is available at https://drive.google.com/open?id=0B4GJ154MimyPVGxqOWhZYk1mWFE&lt;br /&gt;
&lt;br /&gt;
*[https://github.com/expertiza/expertiza Expertiza on GitHub]&lt;br /&gt;
*[http://expertiza.ncsu.edu/ The live Expertiza website]&lt;br /&gt;
*[https://relishapp.com/rspec RSpec Documentation]&lt;/div&gt;</summary>
		<author><name>Proy4</name></author>
	</entry>
	<entry>
		<id>https://wiki.expertiza.ncsu.edu/index.php?title=File:Drawing2.JPG&amp;diff=113681</id>
		<title>File:Drawing2.JPG</title>
		<link rel="alternate" type="text/html" href="https://wiki.expertiza.ncsu.edu/index.php?title=File:Drawing2.JPG&amp;diff=113681"/>
		<updated>2017-11-20T00:51:36Z</updated>

		<summary type="html">&lt;p&gt;Proy4: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Proy4</name></author>
	</entry>
	<entry>
		<id>https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2017/E1780_OSS_Project_Teal_Email_Notification_Enhancements&amp;diff=113680</id>
		<title>CSC/ECE 517 Fall 2017/E1780 OSS Project Teal Email Notification Enhancements</title>
		<link rel="alternate" type="text/html" href="https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2017/E1780_OSS_Project_Teal_Email_Notification_Enhancements&amp;diff=113680"/>
		<updated>2017-11-20T00:51:18Z</updated>

		<summary type="html">&lt;p&gt;Proy4: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This wiki page is for the description of changes made under E1780 OSS Project Teal Email Notification Enhancements.&lt;br /&gt;
&lt;br /&gt;
__TOC__&lt;br /&gt;
&lt;br /&gt;
== About Expertiza ==&lt;br /&gt;
&lt;br /&gt;
[http://expertiza.ncsu.edu/ Expertiza] is an open source project based on [http://rubyonrails.org/ Ruby on Rails] framework. Expertiza allows the instructor to create new assignments and customize new or existing assignments. It also allows the instructor to create a list of topics the students can sign up for. Students can form teams in Expertiza to work on various projects and assignments. Students can also peer review other students' submissions. Expertiza also sends automated emails to the instructor, reviewers and participants for most of the above mentioned activities.&lt;br /&gt;
&lt;br /&gt;
== Problem Statement ==&lt;br /&gt;
The following tasks were accomplished in this project:&lt;br /&gt;
:* Instructor should get an option to create a participant if (s)he does not already exist in the system.&lt;br /&gt;
:* E-mails about reviews should direct the user to the page where the review is found.&lt;br /&gt;
:* Deadline reminders should include a link on where to go to perform the needed function.&lt;br /&gt;
:* Instructor notifications of where reviews disagree by more than a threshold # of points should point the instructor to the reviews that disagree.&lt;br /&gt;
:* Send out an email to the invitee when a participant sends out an invitation to another participant to join a team.&lt;br /&gt;
:* All activity on ad responses and invitations should be reported to the other party by e-mail.&lt;br /&gt;
:* Notify an instructor by e-mail when a student suggests a topic.&lt;br /&gt;
:* Instructor should get a copy of all emails sent to the student.&lt;br /&gt;
&lt;br /&gt;
== UML Case Diagram ==&lt;br /&gt;
[[File:Drawing2.JPG]]&lt;br /&gt;
&lt;br /&gt;
== Peer Review Information ==&lt;br /&gt;
*The CodeClimate build fails for files that were already present, not for the changes made.&lt;br /&gt;
* To test the changes, use the instructor login: username: instructor6, password: password.&lt;br /&gt;
*A self-explanatory video has been uploaded for the same.&lt;br /&gt;
*A discussion of design principles is not needed, as we mostly modified existing work.&lt;br /&gt;
*The project demo video is available at https://drive.google.com/open?id=0B4GJ154MimyPVGxqOWhZYk1mWFE&lt;br /&gt;
&lt;br /&gt;
== Files changed ==&lt;br /&gt;
====&amp;lt;b&amp;gt;Controllers&amp;lt;/b&amp;gt;====                           &lt;br /&gt;
::* invitations_controller.rb                    &lt;br /&gt;
::* profile_controller.rb                        &lt;br /&gt;
::* suggestion_controller.rb&lt;br /&gt;
::* users_controller.rb&lt;br /&gt;
====&amp;lt;b&amp;gt;Helpers&amp;lt;/b&amp;gt;====&lt;br /&gt;
::* mailer_helper.rb&lt;br /&gt;
::* login_helper.rb&lt;br /&gt;
====&amp;lt;b&amp;gt;Mailers&amp;lt;/b&amp;gt;====&lt;br /&gt;
::* delayed_mailer.rb&lt;br /&gt;
::* mailer.rb&lt;br /&gt;
====&amp;lt;b&amp;gt;Models&amp;lt;/b&amp;gt;====&lt;br /&gt;
::* response.rb&lt;br /&gt;
::* user.rb&lt;br /&gt;
====&amp;lt;b&amp;gt;Views&amp;lt;/b&amp;gt;====&lt;br /&gt;
::* invite_message.html.erb&lt;br /&gt;
::* new_topic_suggested_message.html.erb&lt;br /&gt;
::* notify_grade_conflict_message.html.erb&lt;br /&gt;
::* _invitation_accepted_html.html.erb&lt;br /&gt;
::* _invitation_declined_html.html.erb&lt;br /&gt;
::* _invitation_pending_html.html.erb&lt;br /&gt;
::* _new_submission_html.html.erb&lt;br /&gt;
::* _submission_deadline_test_html.html.erb&lt;br /&gt;
::* _additional_links.html.erb&lt;br /&gt;
::* add.js.erb&lt;br /&gt;
::* _prefs.html.erb&lt;br /&gt;
====&amp;lt;b&amp;gt;Spec&amp;lt;/b&amp;gt;====&lt;br /&gt;
::* factories.rb&lt;br /&gt;
::* user_spec.rb&lt;br /&gt;
&lt;br /&gt;
== Solutions Implemented ==&lt;br /&gt;
====&amp;lt;b&amp;gt;Providing option to instructor to create non-existent participant&amp;lt;/b&amp;gt;====&lt;br /&gt;
:A flash message was added for the case where the instructor adds a non-existent user as a participant. A link that redirects the instructor to the user-creation page was also added.&lt;br /&gt;
:&amp;lt;i&amp;gt;Related code snippet&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:Additional_add.PNG]]&lt;br /&gt;
====&amp;lt;b&amp;gt;Link provided to redirect user to page where review is found&amp;lt;/b&amp;gt;====&lt;br /&gt;
:A new link was incorporated in the email to redirect the user to the corresponding review page.&lt;br /&gt;
:&amp;lt;i&amp;gt;Related code snippet&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:new_submission.png]]&lt;br /&gt;
====&amp;lt;b&amp;gt;Incorporating a link for submission deadline reminder&amp;lt;/b&amp;gt;====&lt;br /&gt;
:Tests were run against the already-existing functionality, since deadline reminders are not currently sent by the system.&lt;br /&gt;
:&amp;lt;i&amp;gt;Related code snippet&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:submission_Deadline.png]]&lt;br /&gt;
====Enhancing the email that the instructor receives for contradicting reviews====&lt;br /&gt;
:The email that the instructor receives for contradicting reviews was enhanced by adding the previous average score of the total reviews and the score of the new review. The readability of the email was also increased by adding bullet points wherever necessary.&lt;br /&gt;
:&amp;lt;i&amp;gt;Related code snippet&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:conflict1.png]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;[[File:conflict2.png]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;[[File:conflict3.png]]&amp;lt;br&amp;gt;&lt;br /&gt;
====Sending email to the invitee to join a team====&lt;br /&gt;
:When a student invites one or more other students to join a team for a particular assignment, each invitee receives an email notification.&lt;br /&gt;
:&amp;lt;i&amp;gt;Related code snippet&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:invitation_controller25.png]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;[[File:mailer_helper.png]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;[[File:invite_message.png]]&lt;br /&gt;
====All activities on ad responses and invitations should be reported to the other party by e-mail====&lt;br /&gt;
:Three new partials were created to send emails to both the inviter and the invitee for the responses (accept, decline, pending) to invitations to join a team.&lt;br /&gt;
:&amp;lt;i&amp;gt;Related code snippet&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:invitation_controllers26.png]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;[[File:inv_acc_dec.png]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
====Emailing the instructor when a student suggests a new topic====&lt;br /&gt;
:When a student suggests a new topic, an email is sent to the instructor so that (s)he can decide whether to approve or decline it.&lt;br /&gt;
:&amp;lt;i&amp;gt;Related code snippet&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:sugg_controller.png]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;[[File:topic_sugg.png]]&lt;br /&gt;
====Instructor can receive a copy of emails sent to students if (s)he wishes====&lt;br /&gt;
:The instructor can choose to receive a copy of all emails sent to students in order to verify that the system is functioning correctly.&lt;br /&gt;
:&amp;lt;i&amp;gt;Related code snippet&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:lastreq1.png]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;[[File:lastreq2.png]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;[[File:lastreq3.png]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;[[File:lastreq4.png]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;[[File:lastreq5.png]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;[[File:lastreq6.png]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;[[File:lastreq7.png]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;[[File:lastreq8.png]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;[[File:lastreq9.png]]&lt;br /&gt;
&lt;br /&gt;
== Testing Details and TestPlan ==&lt;br /&gt;
===Testing with UI(Manual Testing)===&lt;br /&gt;
====Requirement1====&lt;br /&gt;
*If someone attempts to assign a nonexistent user as a participant in an assignment by filling out the form on the Add Participants page, (s)he should be warned that the user does not exist.  This is reasonable behaviour because the username may have been mistyped, and you wouldn't want to create a new user account due to a typo.&lt;br /&gt;
=====Testing Steps=====&lt;br /&gt;
Adding a non-existent user as a participant in an assignment&lt;br /&gt;
*Login as an instructor.&lt;br /&gt;
*Click on an existing active assignment.&lt;br /&gt;
*Click on Add Participants and add any non-existent user (any email id).&lt;br /&gt;
*Result: A flash message states that the user does not exist and provides a link to create a new user.&lt;br /&gt;
====Requirement2====&lt;br /&gt;
* E-mails about reviews should direct the user to the page where the review is found.&lt;br /&gt;
=====Testing Steps=====&lt;br /&gt;
*Login as an instructor.&lt;br /&gt;
*Click on an existing active assignment.&lt;br /&gt;
*Add an existing user as a reviewer and another user as a participant to this assignment.&lt;br /&gt;
*Login as the reviewer and submit your review. &lt;br /&gt;
*Result: The participant must have received an email on this review submission with a link to the review submitted.&lt;br /&gt;
====Requirement3====&lt;br /&gt;
*As deadline reminders are not currently implemented in Expertiza, we were asked to verify the RSpec test cases already present in delayed_mailer_spec.rb and to fix any bugs found in them. No bugs were found.&lt;br /&gt;
=====Testing Steps=====&lt;br /&gt;
* Run the RSpec test cases with the following command:&lt;br /&gt;
bundle exec rspec delayed_mailer_spec.rb&lt;br /&gt;
*Result: See the attached screenshot.&lt;br /&gt;
====Requirement4====&lt;br /&gt;
*Instructor notifications of where reviews disagree by more than a threshold # of points should point the instructor to the reviews that disagree.&lt;br /&gt;
*Put the links in a bulleted list that is easy to read.&lt;br /&gt;
*The mail should have the previous average score as well as the new score that is being assigned.&lt;br /&gt;
*Any other kind of improvement to increase the readability of the email is welcome.&lt;br /&gt;
=====Testing Steps=====&lt;br /&gt;
*Login as an instructor and create three students. &lt;br /&gt;
*Hover over Manage Tab --&amp;gt; click on Assignments --&amp;gt; choose an existing active assignment--&amp;gt;Click on Add Participants section of that assignment.&lt;br /&gt;
*Out of the three users that you have created, add two as reviewers and one as participant to that assignment.&lt;br /&gt;
*Login as the participant and submit any link as your work.&lt;br /&gt;
*After the submission deadline is over, login as each of the reviewers and submit two contradicting reviews of the same assignment (say, one gives all 0s and the other gives all 5s).&lt;br /&gt;
*Result: An email is sent to the instructor with the text shown below.&lt;br /&gt;
====Requirement5====&lt;br /&gt;
*Send out an email to the invitee when a participant sends out an invitation to another participant to join a team.&lt;br /&gt;
=====Testing Steps=====&lt;br /&gt;
*Login as an instructor --&amp;gt; click on the Assignments tab --&amp;gt; select an existing active assignment.&lt;br /&gt;
*Add one user as a participant, login as that participant, and invite a student to join the team for the assignment.&lt;br /&gt;
*Result: An email is sent to the invitee.&lt;br /&gt;
====Requirement6====&lt;br /&gt;
*The student who issued the invitation should also be e-mailed when the invitee joins the team. And also when a student responds to a teammate advertisement. In general, all activity on ad responses and invitations should be reported to the other party by e-mail (unless these e-mails are turned off in a (new) profile field).&lt;br /&gt;
=====Testing Steps=====&lt;br /&gt;
*Continue from Requirement 5. &lt;br /&gt;
*Result: Both the inviter and the invitee receive an email, as per the requirement specified above.&lt;br /&gt;
====Requirement7====&lt;br /&gt;
*Notify an instructor by e-mail when a student suggests a topic.&lt;br /&gt;
=====Testing Steps=====&lt;br /&gt;
*Login as an instructor--&amp;gt; Create a user with the Student role.&lt;br /&gt;
*Login as a student--&amp;gt; Click on an existing active assignment.&lt;br /&gt;
*Click on the suggest-a-topic handle, suggest any topic, and save it.&lt;br /&gt;
*Result: The instructor is notified when the topic is suggested.&lt;br /&gt;
====Requirement8====&lt;br /&gt;
*Create an option (in the instructor’s profile) to get a copy of e-mails being sent to students (this is so the instructor can verify correct functioning of the system).&lt;br /&gt;
*Modified Requirement: Implemented for one such scenario; it can easily be replicated for others.&lt;br /&gt;
&lt;br /&gt;
=====Testing Steps=====&lt;br /&gt;
*Login as an existing instructor --&amp;gt; go to the Profile tab.&lt;br /&gt;
*Leave the check box labelled &amp;quot;get copy of all emails&amp;quot; unticked for now.&lt;br /&gt;
*Go to Manage Users --&amp;gt; Create a student.&lt;br /&gt;
*Result 1 (negative testing): The instructor does not receive an email.&lt;br /&gt;
&lt;br /&gt;
*Login as the instructor and go to the Profile tab.&lt;br /&gt;
*Tick the check box labelled &amp;quot;get copy of all emails&amp;quot; and Save.&lt;br /&gt;
*Perform an event: create any other user (e.g. a Student or a Reader) and Save. While creating the user, please ensure you give an email id that you can access.&lt;br /&gt;
&lt;br /&gt;
*Result 2 (positive testing): Both the created user and the instructor receive the email.&lt;br /&gt;
*Note: The instructor receives a copy of all emails sent by the system only if the check box is checked. We have implemented this for just one scenario; it can easily be replicated for the others.&lt;br /&gt;
&lt;br /&gt;
===Automated Testing Scenario(RSpec)===&lt;br /&gt;
:Test cases covering the complete functionality are included.&lt;br /&gt;
:Existing test cases for delayed_mailer.rb were run to check that functionality. A snapshot of running RSpec for the deadline reminders is given below.&lt;br /&gt;
=====Related snippet&amp;lt;br&amp;gt;[[File:rspec421.png]]=====&lt;br /&gt;
&lt;br /&gt;
== Future Work ==&lt;br /&gt;
#A reviewer receives an email after a submission is revised.&lt;br /&gt;
##This email should also state the review round after which the submission was revised.&lt;br /&gt;
##This email should contain a link that redirects the reviewer to the page where the review is found.&lt;br /&gt;
#The reviewer does not receive this email if the last round of review has been completed.&lt;br /&gt;
#Deadline reminder emails should be implemented in the system for both submissions and reviews.&lt;br /&gt;
##A link should be provided in that email so that the user knows where to go to perform the needed action.&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;br /&gt;
*The project demo video is available at https://drive.google.com/open?id=0B4GJ154MimyPVGxqOWhZYk1mWFE&lt;br /&gt;
&lt;br /&gt;
*[https://github.com/expertiza/expertiza Expertiza on GitHub]&lt;br /&gt;
*[http://expertiza.ncsu.edu/ The live Expertiza website]&lt;br /&gt;
*[https://relishapp.com/rspec RSpec Documentation]&lt;/div&gt;</summary>
		<author><name>Proy4</name></author>
	</entry>
	<entry>
		<id>https://wiki.expertiza.ncsu.edu/index.php?title=File:Drawing1.JPG&amp;diff=113679</id>
		<title>File:Drawing1.JPG</title>
		<link rel="alternate" type="text/html" href="https://wiki.expertiza.ncsu.edu/index.php?title=File:Drawing1.JPG&amp;diff=113679"/>
		<updated>2017-11-20T00:43:52Z</updated>

		<summary type="html">&lt;p&gt;Proy4: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Proy4</name></author>
	</entry>
	<entry>
		<id>https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2017/E1780_OSS_Project_Teal_Email_Notification_Enhancements&amp;diff=113678</id>
		<title>CSC/ECE 517 Fall 2017/E1780 OSS Project Teal Email Notification Enhancements</title>
		<link rel="alternate" type="text/html" href="https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2017/E1780_OSS_Project_Teal_Email_Notification_Enhancements&amp;diff=113678"/>
		<updated>2017-11-20T00:43:17Z</updated>

		<summary type="html">&lt;p&gt;Proy4: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This wiki page is for the description of changes made under E1780 OSS Project Teal Email Notification Enhancements.&lt;br /&gt;
&lt;br /&gt;
__TOC__&lt;br /&gt;
&lt;br /&gt;
== About Expertiza ==&lt;br /&gt;
&lt;br /&gt;
[http://expertiza.ncsu.edu/ Expertiza] is an open source project based on [http://rubyonrails.org/ Ruby on Rails] framework. Expertiza allows the instructor to create new assignments and customize new or existing assignments. It also allows the instructor to create a list of topics the students can sign up for. Students can form teams in Expertiza to work on various projects and assignments. Students can also peer review other students' submissions. Expertiza also sends automated emails to the instructor, reviewers and participants for most of the above mentioned activities.&lt;br /&gt;
&lt;br /&gt;
== Problem Statement ==&lt;br /&gt;
The following tasks were accomplished in this project:&lt;br /&gt;
:* Instructor should get an option to create a participant if (s)he does not already exist in the system.&lt;br /&gt;
:* E-mails about reviews should direct the user to the page where the review is found.&lt;br /&gt;
:* Deadline reminders should include a link on where to go to perform the needed function.&lt;br /&gt;
:* Instructor notifications of where reviews disagree by more than a threshold # of points should point the instructor to the reviews that disagree.&lt;br /&gt;
:* Send out an email to the invitee when a participant sends out an invitation to another participant to join a team.&lt;br /&gt;
:* All activity on ad responses and invitations should be reported to the other party by e-mail.&lt;br /&gt;
:* Notify an instructor by e-mail when a student suggests a topic.&lt;br /&gt;
:* Instructor should get a copy of all emails sent to the student.&lt;br /&gt;
&lt;br /&gt;
== UML Case Diagram ==&lt;br /&gt;
[[File:Drawing1.JPG]]&lt;br /&gt;
&lt;br /&gt;
== Peer Review Information ==&lt;br /&gt;
*The CodeClimate build fails for files that were already present, not for the changes made.&lt;br /&gt;
* To test the changes, use the instructor login: username: instructor6, password: password.&lt;br /&gt;
*A self-explanatory video has been uploaded for the same.&lt;br /&gt;
*A discussion of design principles is not needed, as we mostly modified existing work.&lt;br /&gt;
*The project demo video is available at https://drive.google.com/open?id=0B4GJ154MimyPVGxqOWhZYk1mWFE&lt;br /&gt;
&lt;br /&gt;
== Files changed ==&lt;br /&gt;
====&amp;lt;b&amp;gt;Controllers&amp;lt;/b&amp;gt;====                           &lt;br /&gt;
::* invitations_controller.rb                    &lt;br /&gt;
::* profile_controller.rb                        &lt;br /&gt;
::* suggestion_controller.rb&lt;br /&gt;
::* users_controller.rb&lt;br /&gt;
====&amp;lt;b&amp;gt;Helpers&amp;lt;/b&amp;gt;====&lt;br /&gt;
::* mailer_helper.rb&lt;br /&gt;
::* login_helper.rb&lt;br /&gt;
====&amp;lt;b&amp;gt;Mailers&amp;lt;/b&amp;gt;====&lt;br /&gt;
::* delayed_mailer.rb&lt;br /&gt;
::* mailer.rb&lt;br /&gt;
====&amp;lt;b&amp;gt;Models&amp;lt;/b&amp;gt;====&lt;br /&gt;
::* response.rb&lt;br /&gt;
::* user.rb&lt;br /&gt;
====&amp;lt;b&amp;gt;Views&amp;lt;/b&amp;gt;====&lt;br /&gt;
::* invite_message.html.erb&lt;br /&gt;
::* new_topic_suggested_message.html.erb&lt;br /&gt;
::* notify_grade_conflict_message.html.erb&lt;br /&gt;
::* _invitation_accepted_html.html.erb&lt;br /&gt;
::* _invitation_declined_html.html.erb&lt;br /&gt;
::* _invitation_pending_html.html.erb&lt;br /&gt;
::* _new_submission_html.html.erb&lt;br /&gt;
::* _submission_deadline_test_html.html.erb&lt;br /&gt;
::* _additional_links.html.erb&lt;br /&gt;
::* add.js.erb&lt;br /&gt;
::* _prefs.html.erb&lt;br /&gt;
====&amp;lt;b&amp;gt;Spec&amp;lt;/b&amp;gt;====&lt;br /&gt;
::* factories.rb&lt;br /&gt;
::* user_spec.rb&lt;br /&gt;
&lt;br /&gt;
== Solutions Implemented ==&lt;br /&gt;
====&amp;lt;b&amp;gt;Providing an option for the instructor to create a non-existent participant&amp;lt;/b&amp;gt;====&lt;br /&gt;
:A flash message is now shown when the instructor adds a non-existent user as a participant. A new link that redirects the instructor to the user-creation page was also added.&lt;br /&gt;
:&amp;lt;i&amp;gt;Related code snippet&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:Additional_add.PNG]]&lt;br /&gt;
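The check can be sketched in plain Ruby as below; the method name, message text, and link path are illustrative assumptions for this sketch, not the actual Expertiza code.&lt;br /&gt;

```ruby
# Hypothetical sketch of the participant-existence check: when the requested
# username is unknown, return an error flash plus a link to the user-creation
# page. All names and paths here are illustrative, not the real Expertiza code.
def participant_flash(existing_usernames, requested_name)
  if existing_usernames.include?(requested_name)
    { notice: "Participant added." }
  else
    { error: "The user #{requested_name} does not exist. Please create the user first.",
      link: "/users/new?name=#{requested_name}" }
  end
end
```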
====&amp;lt;b&amp;gt;Link provided to redirect user to page where review is found&amp;lt;/b&amp;gt;====&lt;br /&gt;
:A new link was incorporated in the email to redirect the user to the corresponding review page.&lt;br /&gt;
:&amp;lt;i&amp;gt;Related code snippet&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:new_submission.png]]&lt;br /&gt;
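The idea reduces to composing a direct URL for the e-mail body; the helper below is an illustrative sketch, and the route shape is an assumption rather than the actual Expertiza route.&lt;br /&gt;

```ruby
# Illustrative helper for the direct link embedded in the review e-mail;
# the base URL and route are assumptions made for this sketch.
def review_link(base_url, response_id)
  "#{base_url}/response/view?id=#{response_id}"
end
```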
====&amp;lt;b&amp;gt;Incorporating a link for submission deadline reminder&amp;lt;/b&amp;gt;====&lt;br /&gt;
:Since deadline reminders are not sent under the current functionality, the already existing functionality was tested instead.&lt;br /&gt;
:&amp;lt;i&amp;gt;Related code snippet&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:submission_Deadline.png]]&lt;br /&gt;
====Enhancing the email that the instructor receives for contradicting reviews====&lt;br /&gt;
:The email that the instructor receives for contradicting reviews was enhanced by adding the previous average score across all reviews and the score of the new review. The readability of the email was also improved by adding bullet points where appropriate.&lt;br /&gt;
:&amp;lt;i&amp;gt;Related code snippet&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:conflict1.png]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;[[File:conflict2.png]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;[[File:conflict3.png]]&amp;lt;br&amp;gt;&lt;br /&gt;
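The score summary added to the conflict e-mail can be sketched as below; the method and field names are ours, not the actual model code.&lt;br /&gt;

```ruby
# Sketch of the score summary added to the conflict e-mail: the running
# average of the earlier reviews next to the new review's score, plus a
# flag for whether they disagree by at least the threshold. Names are ours.
def conflict_summary(previous_scores, new_score, threshold)
  avg = previous_scores.sum.to_f / previous_scores.size
  { previous_average: avg.round(2),
    new_score: new_score,
    conflict: (new_score - avg).abs >= threshold }
end
```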
====Sending email to the invitee to join a team====&lt;br /&gt;
:When a student invites one or more other students to join a team for a particular assignment, each invitee now receives an email notification.&lt;br /&gt;
:&amp;lt;i&amp;gt;Related code snippet&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:invitation_controller25.png]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;[[File:mailer_helper.png]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;[[File:invite_message.png]]&lt;br /&gt;
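A minimal sketch of the message built for the invitee follows; the subject and body wording are assumptions, and the real code sends through the Rails mailer shown in the snippets above.&lt;br /&gt;

```ruby
# Minimal sketch of the invitation notification: build the fields of the
# message sent to the invitee. Wording and method name are assumptions.
def invitation_email(invitee_email, inviter_name, assignment_name)
  { to: invitee_email,
    subject: "Invitation to join a team for #{assignment_name}",
    body: "#{inviter_name} has invited you to join their team." }
end
```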
====All activities on ad responses and invitations should be reported to the other party by e-mail====&lt;br /&gt;
:Three new partials were created to send emails to both the inviter and the invitee for each response (accept, decline, pending) to an invitation to join a team.&lt;br /&gt;
:&amp;lt;i&amp;gt;Related code snippet&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:invitation_controllers26.png]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;[[File:inv_acc_dec.png]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
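Selecting which of the three partials to render can be sketched as a simple mapping; the partial names mirror the files listed above, while the one-letter status codes are an assumption of this sketch.&lt;br /&gt;

```ruby
# Sketch of choosing the e-mail partial per invitation reply; the partial
# names mirror the view files listed above, the status codes are assumed.
def reply_partial(status)
  case status
  when "A" then "invitation_accepted"
  when "D" then "invitation_declined"
  else          "invitation_pending"
  end
end
```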
&lt;br /&gt;
====Emailing the instructor when a student suggests a new topic====&lt;br /&gt;
:When a student suggests a new topic, an email is sent to the instructor so that (s)he can decide whether to approve or decline it.&lt;br /&gt;
:&amp;lt;i&amp;gt;Related code snippet&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:sugg_controller.png]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;[[File:topic_sugg.png]]&lt;br /&gt;
====Instructor should receive a copy of emails sent to students if (s)he wishes====&lt;br /&gt;
:The instructor can choose to receive a copy of all emails sent to students, in order to verify the proper functioning of the system.&lt;br /&gt;
:&amp;lt;i&amp;gt;Related code snippet&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:lastreq1.png]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;[[File:lastreq2.png]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;[[File:lastreq3.png]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;[[File:lastreq4.png]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;[[File:lastreq5.png]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;[[File:lastreq6.png]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;[[File:lastreq7.png]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;[[File:lastreq8.png]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;[[File:lastreq9.png]]&lt;br /&gt;
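The opt-in copy boils down to extending the recipient list when the instructor's profile flag is set; the sketch below uses illustrative field names, not the actual profile schema.&lt;br /&gt;

```ruby
# Sketch of the opt-in copy: when the instructor's profile flag is set, the
# instructor's address is appended to the recipient list. Field names are
# illustrative, not the actual profile schema.
def recipients(student_email, instructor)
  list = [student_email]
  list.push(instructor[:email]) if instructor[:copy_of_emails]
  list
end
```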
&lt;br /&gt;
== Testing Details and TestPlan ==&lt;br /&gt;
===Testing with UI(Manual Testing)===&lt;br /&gt;
====Requirement1====&lt;br /&gt;
*If someone attempts to assign a nonexistent user as a participant in an assignment by filling out the form on the Add Participants page, (s)he should be warned that the user does not exist.  This is reasonable behaviour because the username may have been mistyped, and you wouldn't want to create a new user account due to a typo.&lt;br /&gt;
=====Testing Steps=====&lt;br /&gt;
Adding a non-existent user as a participant in an assignment&lt;br /&gt;
*Login as an instructor.&lt;br /&gt;
*Click on an existing active assignment.&lt;br /&gt;
*Click on Add Participants and add any non-existent user (any email id).&lt;br /&gt;
*Result: A flash message states that the user does not exist and provides a link to create a new user.&lt;br /&gt;
====Requirement2====&lt;br /&gt;
* E-mails about reviews should direct the user to the page where the review is found.&lt;br /&gt;
=====Testing Steps=====&lt;br /&gt;
*Login as an instructor.&lt;br /&gt;
*Click on an existing active assignment.&lt;br /&gt;
*Add an existing user as a reviewer and another user as a participant to this assignment.&lt;br /&gt;
*Login as the reviewer and submit your review. &lt;br /&gt;
*Result: The participant must have received an email on this review submission with a link to the review submitted.&lt;br /&gt;
====Requirement3====&lt;br /&gt;
*As deadline reminders are not currently implemented in Expertiza, we were asked to verify the RSpec test cases already present in delayed_mailer_spec.rb, and to fix any bugs found there. No bugs were found.&lt;br /&gt;
=====Testing Steps=====&lt;br /&gt;
* Run the RSpec test cases with the following command:&lt;br /&gt;
bundle exec rspec delayed_mailer_spec.rb&lt;br /&gt;
*Result : See the attachment for the screenshot.&lt;br /&gt;
====Requirement4====&lt;br /&gt;
*Instructor notifications of where reviews disagree by more than a threshold # of points should point the instructor to the reviews that disagree.&lt;br /&gt;
*Put the links in a bulleted list that is easy to read.&lt;br /&gt;
*The mail should have the previous average score as well as the new score that is being assigned.&lt;br /&gt;
*Any other improvement that increases the readability of the email is welcome.&lt;br /&gt;
=====Testing Steps=====&lt;br /&gt;
*Login as an instructor and create three students. &lt;br /&gt;
*Hover over the Manage tab --&amp;gt; click on Assignments --&amp;gt; choose an existing active assignment --&amp;gt; click on the Add Participants section of that assignment.&lt;br /&gt;
*Out of the three users that you have created, add two as reviewers and one as participant to that assignment.&lt;br /&gt;
*Login as the participant and submit any link as your work.&lt;br /&gt;
*After the submission deadline is over, login as each of the reviewers and submit two contradicting reviews of the same assignment (say, one gives all 0s, the other all 5s).&lt;br /&gt;
*Result: This should send an email to the instructor flagging the contradicting reviews.&lt;br /&gt;
====Requirement5====&lt;br /&gt;
*Send out an email to the invitee when a participant sends out an invitation to another participant to join a team.&lt;br /&gt;
=====Testing Steps=====&lt;br /&gt;
*Login as an instructor --&amp;gt; click on the Assignments tab --&amp;gt; select an existing active assignment.&lt;br /&gt;
*Add one user as a participant, login as that participant, and invite a student to join the team for the assignment.&lt;br /&gt;
*Result: Email has been sent to invitee.&lt;br /&gt;
====Requirement6====&lt;br /&gt;
*The student who issued the invitation should also be e-mailed when the invitee joins the team. And also when a student responds to a teammate advertisement. In general, all activity on ad responses and invitations should be reported to the other party by e-mail (unless these e-mails are turned off in a (new) profile field).&lt;br /&gt;
=====Testing Steps=====&lt;br /&gt;
*Continue from Requirement 5. &lt;br /&gt;
*Result : Both the inviter and the invitee should get an email as per the requirement specified above.&lt;br /&gt;
====Requirement7====&lt;br /&gt;
*Notify an instructor by e-mail when a student suggests a topic.&lt;br /&gt;
=====Testing Steps=====&lt;br /&gt;
*Login as an instructor --&amp;gt; create a user with the Student role.&lt;br /&gt;
*Login as the student --&amp;gt; click on an existing active assignment.&lt;br /&gt;
*Click on the suggest-a-topic handle, suggest any topic, and save it.&lt;br /&gt;
*Result: The instructor is notified when the topic is suggested.&lt;br /&gt;
====Requirement8====&lt;br /&gt;
*Create an option (in the instructor's profile) to get a copy of e-mails being sent to students (this is so the instructor can verify correct functioning of the system).&lt;br /&gt;
*Modified requirement: implemented for one such scenario; it can easily be replicated for the others.&lt;br /&gt;
&lt;br /&gt;
=====Testing Steps=====&lt;br /&gt;
*Login as an existing instructor --&amp;gt; go to profile tab.&lt;br /&gt;
*Do not tick the check box mentioning &amp;quot;get copy of all emails&amp;quot; yet.&lt;br /&gt;
*Go to Manage Users --&amp;gt; Create a student.&lt;br /&gt;
*Result 1 (negative testing) : Instructor does not receive an email.&lt;br /&gt;
&lt;br /&gt;
*Login as the instructor and go to the Profile tab.&lt;br /&gt;
*Tick the check box mentioning &amp;quot;get copy of all emails&amp;quot; and save.&lt;br /&gt;
*Perform an event; here, create any other user (e.g., a Student or Reader) and save. While creating the user, please ensure you give an email id that you can access.&lt;br /&gt;
&lt;br /&gt;
*Result 2 (positive testing): Both the created user and the instructor receive the email.&lt;br /&gt;
*Note: The instructor receives a copy of all emails sent by the system only if the check box is checked. We have implemented this for one such scenario; it can easily be replicated for the others.&lt;br /&gt;
&lt;br /&gt;
===Automated Testing Scenario(RSpec)===&lt;br /&gt;
:Test cases covering the complete functionality are included.&lt;br /&gt;
:There were existing test cases for delayed_mailer.rb, and the functionality was checked by running them. A snapshot of the RSpec run for the deadline-reminder tests is given below.&lt;br /&gt;
:&amp;lt;i&amp;gt;Related snippet&amp;lt;/i&amp;gt;&amp;lt;br&amp;gt;[[File:rspec421.png]]&lt;br /&gt;
&lt;br /&gt;
== Future Work ==&lt;br /&gt;
#The reviewer receives an email after a submission is revised.&lt;br /&gt;
##This email should also state the review round after which the submission was revised.&lt;br /&gt;
##This email should contain a link that redirects the reviewer to the page where the review is found.&lt;br /&gt;
#Currently, the reviewer does not receive the above email if the last round of review has been completed.&lt;br /&gt;
#Deadline reminder emails should be implemented in the system for both submissions and reviews.&lt;br /&gt;
##A link should be provided in that email so that the user knows where to go to perform the needed action.&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;br /&gt;
*The Project Demo Video is present at: https://drive.google.com/open?id=0B4GJ154MimyPVGxqOWhZYk1mWFE&lt;br /&gt;
&lt;br /&gt;
*[https://github.com/expertiza/expertiza Expertiza on GitHub]&lt;br /&gt;
*[http://expertiza.ncsu.edu/ The live Expertiza website]&lt;br /&gt;
*[https://relishapp.com/rspec Rspec Documentation]&lt;/div&gt;</summary>
		<author><name>Proy4</name></author>
	</entry>
	<entry>
		<id>https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2017/E1792_OSS_Visualizations_for_instructors&amp;diff=112231</id>
		<title>CSC/ECE 517 Fall 2017/E1792 OSS Visualizations for instructors</title>
		<link rel="alternate" type="text/html" href="https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2017/E1792_OSS_Visualizations_for_instructors&amp;diff=112231"/>
		<updated>2017-11-07T04:52:45Z</updated>

		<summary type="html">&lt;p&gt;Proy4: /* Test Plan: */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;= Visualisations for instructors =&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
== '''Issue Statement &amp;amp; Approach followed:''' ==&lt;br /&gt;
'''Issue 1:''' For the links in each cell, it is not clear what will happen when the link is clicked or which page will open.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 1:''' Each cell contains a link to a different page in Expertiza, but there is no way to know which cell is linked to which page, so the user does not know the destination page before clicking. Since we know there is a link in each cell, we will show hover-over text displaying the link of the destination page.&lt;br /&gt;
&lt;br /&gt;
'''Issue 2:''' The scale is blue to green, which does not make sense, and the colors change randomly each time the page is loaded. It would be better to scale from red to green.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 2:''' Here the color coding is blue to green, and it also changes randomly. We will use RGB color coding to make it red to green, using the percentage to decide the color: for example, if the score is between 0 and 20%, make it red, and so on. This way, no matter what scale is used, the color coding will always be appropriate.&lt;br /&gt;
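The percentage-based red-to-green mapping described above can be sketched in plain Ruby; the bucket boundaries and color names here are illustrative assumptions.&lt;br /&gt;

```ruby
# Sketch of the percentage-based color mapping: the raw score is normalised
# to 0-100 first, so the coding is stable no matter what scale the rubric
# uses. Bucket boundaries and color names are illustrative.
def heat_color(score, max_score)
  pct = 100.0 * score / max_score
  case pct
  when 0...20  then "red"
  when 20...40 then "orange"
  when 40...60 then "yellow"
  when 60...80 then "yellowgreen"
  else              "green"
  end
end
```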
&lt;br /&gt;
'''Issue 3:''' Two adjacent bars represent the responses in rounds 1 to k. This makes sense only if the rubrics in all review rounds are the same. If the instructor uses the vary-rubric-by-round mechanism, this visualization will not make sense.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 3:''' The problem is that the review responses are shown in a grid containing all the reviews from round 1 to round n. This is fine when the rubric is the same for all rounds, but when the rubric differs from round to round, this view does not make sense. We therefore plan to create a separate grid for each review round, which solves the problem both when the rubric differs and when it is the same.&lt;br /&gt;
&lt;br /&gt;
'''Issue 4:''' The table is presorted by teams, but it should also be possible to sort it alphabetically. The cell view looks far too long and should be divided into partials.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 4:''' We can sort the select query alphabetically on the appropriate column, or sort the table dynamically according to user-chosen criteria. The overly long view can be addressed with paging.&lt;br /&gt;
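The sort-plus-paging idea above can be illustrated with a plain-Ruby stand-in (the real version would be an ORDER BY with LIMIT/OFFSET in the query); the helper name is ours.&lt;br /&gt;

```ruby
# Plain-Ruby stand-in for alphabetical sorting plus paging: sort the names,
# cut them into fixed-size pages, and return the requested page (or an
# empty list past the end). A sketch, not the actual query code.
def page_of_names(names, page, per_page)
  pages = names.sort.each_slice(per_page).to_a
  pages[page] || []
end
```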
&lt;br /&gt;
'''Issue 5:''' An interactive visualization or table that shows how a class performed on selected rubric criteria would be immensely helpful. It would show me what I need to focus more attention on. &lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 5:''' The issue is about showing the performance of the entire class graded against the five rubric criteria. A proposed solution is to use the logic below, from the controller assessment360_controller.rb, which calculates the average grade for each student across all assignments in a particular course.&lt;br /&gt;
[[File:scr2.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
The average grades of each student can be pulled and then used to plot a bar graph as shown below (this has been created using dummy data) :-&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Dummydata.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 1 :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Assignment1.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 2 :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Assignment2.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 3 :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Assignment3.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 4 :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Assignment4.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 5 :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Assignment5.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Average Class performance :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Average.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
The workflow can look approximately like the diagram below :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Workflow.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
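As a plain-Ruby illustration of the average-grade idea above (a simplified stand-in for the assessment360_controller.rb logic; names are ours, not the controller's):&lt;br /&gt;

```ruby
# Simplified stand-in for the per-student course average: grades maps
# assignment name to score, and missing (nil) scores are skipped. A sketch
# under assumed names, not the actual controller code.
def course_average(grades)
  scores = grades.values.compact
  return 0.0 if scores.empty?
  (scores.sum.to_f / scores.size).round(2)
end
```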
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
== Test Plan: ==&lt;br /&gt;
'''Issue 1:''' Check that the hover-over text matches the link.&lt;br /&gt;
&lt;br /&gt;
'''Issue 2:''' Check the color coding when different scores are given on different scales.&lt;br /&gt;
&lt;br /&gt;
'''Issue 3:''' Enter multiple reviews with different rubrics and check that a separate table/grid is shown for each review round.&lt;br /&gt;
&lt;br /&gt;
'''Issue 4:''' Check that the table is sorted alphabetically on the appropriate column.&lt;br /&gt;
&lt;br /&gt;
'''Issue 5:'''&lt;br /&gt;
*1) Login as the instructor.&lt;br /&gt;
*2) Click on the button to compute the graphs.&lt;br /&gt;
*3) Compare the bar graphs with the separate scores of the students in each assignment.&lt;/div&gt;</summary>
		<author><name>Proy4</name></author>
	</entry>
	<entry>
		<id>https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2017/E1792_OSS_Visualizations_for_instructors&amp;diff=112230</id>
		<title>CSC/ECE 517 Fall 2017/E1792 OSS Visualizations for instructors</title>
		<link rel="alternate" type="text/html" href="https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2017/E1792_OSS_Visualizations_for_instructors&amp;diff=112230"/>
		<updated>2017-11-07T04:52:31Z</updated>

		<summary type="html">&lt;p&gt;Proy4: /* Test Plan: */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;= Visualisations for instructors =&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
== '''Issue Statement &amp;amp; Approach followed:''' ==&lt;br /&gt;
'''Issue 1:''' For the links in each cell, it is not clear what will happen when the link is clicked or which page will open.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 1:''' Each cell contains a link to a different page in Expertiza, but there is no way to know which cell is linked to which page, so the user does not know the destination page before clicking. Since we know there is a link in each cell, we will show hover-over text displaying the link of the destination page.&lt;br /&gt;
&lt;br /&gt;
'''Issue 2:''' The scale is blue to green, which does not make sense, and the colors change randomly each time the page is loaded. It would be better to scale from red to green.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 2:''' Here the color coding is blue to green, and it also changes randomly. We will use RGB color coding to make it red to green, using the percentage to decide the color: for example, if the score is between 0 and 20%, make it red, and so on. This way, no matter what scale is used, the color coding will always be appropriate.&lt;br /&gt;
&lt;br /&gt;
'''Issue 3:''' Two adjacent bars represent the responses in rounds 1 to k. This makes sense only if the rubrics in all review rounds are the same. If the instructor uses the vary-rubric-by-round mechanism, this visualization will not make sense.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 3:''' The problem is that the review responses are shown in a grid containing all the reviews from round 1 to round n. This is fine when the rubric is the same for all rounds, but when the rubric differs from round to round, this view does not make sense. We therefore plan to create a separate grid for each review round, which solves the problem both when the rubric differs and when it is the same.&lt;br /&gt;
&lt;br /&gt;
'''Issue 4:''' The table is presorted by teams, but it should also be possible to sort it alphabetically. The cell view looks far too long and should be divided into partials.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 4:''' We can sort the select query alphabetically on the appropriate column, or sort the table dynamically according to user-chosen criteria. The overly long view can be addressed with paging.&lt;br /&gt;
&lt;br /&gt;
'''Issue 5:''' An interactive visualization or table that shows how a class performed on selected rubric criteria would be immensely helpful. It would show me what I need to focus more attention on.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 5:''' The issue is about showing the performance of the entire class graded against the five rubric criteria. A proposed solution is to use the logic below, from the controller assessment360_controller.rb, which calculates the average grade for each student across all assignments in a particular course.&lt;br /&gt;
[[File:scr2.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
The average grades of each student can be pulled and then used to plot a bar graph as shown below (this has been created using dummy data) :-&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Dummydata.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 1 :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Assignment1.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 2 :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Assignment2.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 3 :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Assignment3.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 4 :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Assignment4.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 5 :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Assignment5.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Average Class performance :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Average.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
The workflow can look approximately like the diagram below :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Workflow.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
== Test Plan: ==&lt;br /&gt;
'''Issue 1:''' Check that the hover-over text matches the link.&lt;br /&gt;
&lt;br /&gt;
'''Issue 2:''' Check the color coding when different scores are given on different scales.&lt;br /&gt;
&lt;br /&gt;
'''Issue 3:''' Enter multiple reviews with different rubrics and check that a separate table/grid is shown for each review round.&lt;br /&gt;
&lt;br /&gt;
'''Issue 4:''' Check that the table is sorted alphabetically on the appropriate column.&lt;br /&gt;
&lt;br /&gt;
'''Issue 5:'''&lt;br /&gt;
*1) Login as the instructor.&lt;br /&gt;
*2) Click on the button to compute the graphs.&lt;br /&gt;
*3) Compare the bar graphs with the separate scores of the students in each assignment.&lt;/div&gt;</summary>
		<author><name>Proy4</name></author>
	</entry>
	<entry>
		<id>https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2017/E1792_OSS_Visualizations_for_instructors&amp;diff=112229</id>
		<title>CSC/ECE 517 Fall 2017/E1792 OSS Visualizations for instructors</title>
		<link rel="alternate" type="text/html" href="https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2017/E1792_OSS_Visualizations_for_instructors&amp;diff=112229"/>
		<updated>2017-11-07T04:50:59Z</updated>

		<summary type="html">&lt;p&gt;Proy4: /* Issue Statement &amp;amp; Approach followed: */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;= Visualisations for instructors =&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
== '''Issue Statement &amp;amp; Approach followed:''' ==&lt;br /&gt;
'''Issue 1:''' For the links in each cell, it is not clear what will happen when the link is clicked or which page will open.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 1:''' Each cell contains a link to a different page in Expertiza, but there is no way to know which cell is linked to which page, so the user does not know the destination page before clicking. Since we know there is a link in each cell, we will show hover-over text displaying the link of the destination page.&lt;br /&gt;
&lt;br /&gt;
'''Issue 2:''' The scale is blue to green, which does not make sense, and the colors change randomly each time the page is loaded. It would be better to scale from red to green.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 2:''' Here the color coding is blue to green, and it also changes randomly. We will use RGB color coding to make it red to green, using the percentage to decide the color: for example, if the score is between 0 and 20%, make it red, and so on. This way, no matter what scale is used, the color coding will always be appropriate.&lt;br /&gt;
&lt;br /&gt;
'''Issue 3:''' Two adjacent bars represent the responses in rounds 1 to k. This makes sense only if the rubrics in all review rounds are the same. If the instructor uses the vary-rubric-by-round mechanism, this visualization will not make sense.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 3:''' The problem is that the review responses are shown in a grid containing all the reviews from round 1 to round n. This is fine when the rubric is the same for all rounds, but when the rubric differs from round to round, this view does not make sense. We therefore plan to create a separate grid for each review round, which solves the problem both when the rubric differs and when it is the same.&lt;br /&gt;
&lt;br /&gt;
'''Issue 4:''' The table is presorted by teams, but it should also be possible to sort it alphabetically. The cell view looks way too long, and should be divided into partials.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 4:''' We can sort the select query alphabetically on the appropriate column, or sort the table dynamically according to user-chosen criteria. The overly long view can be addressed with paging.&lt;br /&gt;
&lt;br /&gt;
'''Issue 5:''' An interactive visualization or table that shows how a class performed on selected rubric criteria would be immensely helpful. It would show me what I need to focus more attention on.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 5:''' The issue is about showing the performance of the entire class graded against the five rubric criteria. A proposed solution is to use the logic below, from the controller assessment360_controller.rb, which calculates the average grade for each student across all assignments in a particular course.&lt;br /&gt;
[[File:scr2.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
The average grades of each student can be pulled and then used to plot a bar graph as shown below (this has been created using dummy data) :-&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Dummydata.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 1 :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Assignment1.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 2 :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Assignment2.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 3 :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Assignment3.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 4 :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Assignment4.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 5 :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Assignment5.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Average Class performance :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Average.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
The workflow can look approximately like the diagram below :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Workflow.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
== Test Plan: ==&lt;br /&gt;
'''Issue 1:''' Check that the hover-over text matches the link.&lt;br /&gt;
&lt;br /&gt;
'''Issue 2:''' Check the color coding when different scores are given on different scales.&lt;br /&gt;
&lt;br /&gt;
'''Issue 3:''' Enter multiple reviews with different rubrics and check that a separate table/grid is shown for each review round.&lt;br /&gt;
&lt;br /&gt;
'''Issue 4:''' Check that the table is sorted alphabetically on the appropriate column.&lt;/div&gt;</summary>
		<author><name>Proy4</name></author>
	</entry>
	<entry>
		<id>https://wiki.expertiza.ncsu.edu/index.php?title=File:Workflow.PNG&amp;diff=112228</id>
		<title>File:Workflow.PNG</title>
		<link rel="alternate" type="text/html" href="https://wiki.expertiza.ncsu.edu/index.php?title=File:Workflow.PNG&amp;diff=112228"/>
		<updated>2017-11-07T04:50:18Z</updated>

		<summary type="html">&lt;p&gt;Proy4: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Proy4</name></author>
	</entry>
	<entry>
		<id>https://wiki.expertiza.ncsu.edu/index.php?title=File:Average.PNG&amp;diff=112227</id>
		<title>File:Average.PNG</title>
		<link rel="alternate" type="text/html" href="https://wiki.expertiza.ncsu.edu/index.php?title=File:Average.PNG&amp;diff=112227"/>
		<updated>2017-11-07T04:49:58Z</updated>

		<summary type="html">&lt;p&gt;Proy4: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Proy4</name></author>
	</entry>
	<entry>
		<id>https://wiki.expertiza.ncsu.edu/index.php?title=File:Assignment5.PNG&amp;diff=112226</id>
		<title>File:Assignment5.PNG</title>
		<link rel="alternate" type="text/html" href="https://wiki.expertiza.ncsu.edu/index.php?title=File:Assignment5.PNG&amp;diff=112226"/>
		<updated>2017-11-07T04:49:40Z</updated>

		<summary type="html">&lt;p&gt;Proy4: uploaded a new version of &amp;amp;quot;File:Assignment5.PNG&amp;amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Proy4</name></author>
	</entry>
	<entry>
		<id>https://wiki.expertiza.ncsu.edu/index.php?title=File:Assignment5.PNG&amp;diff=112225</id>
		<title>File:Assignment5.PNG</title>
		<link rel="alternate" type="text/html" href="https://wiki.expertiza.ncsu.edu/index.php?title=File:Assignment5.PNG&amp;diff=112225"/>
		<updated>2017-11-07T04:49:40Z</updated>

		<summary type="html">&lt;p&gt;Proy4: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Proy4</name></author>
	</entry>
	<entry>
		<id>https://wiki.expertiza.ncsu.edu/index.php?title=File:Assignment4.PNG&amp;diff=112224</id>
		<title>File:Assignment4.PNG</title>
		<link rel="alternate" type="text/html" href="https://wiki.expertiza.ncsu.edu/index.php?title=File:Assignment4.PNG&amp;diff=112224"/>
		<updated>2017-11-07T04:49:19Z</updated>

		<summary type="html">&lt;p&gt;Proy4: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Proy4</name></author>
	</entry>
	<entry>
		<id>https://wiki.expertiza.ncsu.edu/index.php?title=File:Assignment3.PNG&amp;diff=112223</id>
		<title>File:Assignment3.PNG</title>
		<link rel="alternate" type="text/html" href="https://wiki.expertiza.ncsu.edu/index.php?title=File:Assignment3.PNG&amp;diff=112223"/>
		<updated>2017-11-07T04:48:53Z</updated>

		<summary type="html">&lt;p&gt;Proy4: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Proy4</name></author>
	</entry>
	<entry>
		<id>https://wiki.expertiza.ncsu.edu/index.php?title=File:Assignment2.PNG&amp;diff=112222</id>
		<title>File:Assignment2.PNG</title>
		<link rel="alternate" type="text/html" href="https://wiki.expertiza.ncsu.edu/index.php?title=File:Assignment2.PNG&amp;diff=112222"/>
		<updated>2017-11-07T04:48:31Z</updated>

		<summary type="html">&lt;p&gt;Proy4: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Proy4</name></author>
	</entry>
	<entry>
		<id>https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2017/E1792_OSS_Visualizations_for_instructors&amp;diff=112221</id>
		<title>CSC/ECE 517 Fall 2017/E1792 OSS Visualizations for instructors</title>
		<link rel="alternate" type="text/html" href="https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2017/E1792_OSS_Visualizations_for_instructors&amp;diff=112221"/>
		<updated>2017-11-07T04:48:15Z</updated>

		<summary type="html">&lt;p&gt;Proy4: /* Issue Statement &amp;amp; Approach followed: */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;= Visualizations for instructors =&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
== '''Issue Statement &amp;amp; Approach followed:''' ==&lt;br /&gt;
'''Issue 1:''' Each cell contains a link, but it is not clear what will happen when the link is clicked or which page will open.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 1:''' Each cell links to a different page in Expertiza, but there is no way to tell which cell leads to which page, so the user does not know the destination before clicking. Since every cell contains a link, we will show hover-over text displaying the URL of the destination page.&lt;br /&gt;
&lt;br /&gt;
'''Issue 2:''' The scale runs from blue to green, which does not make sense, and the colors change randomly each time the page loads. It would be better to scale from red to green.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 2:''' The current color coding runs from blue to green and also changes randomly. We will use RGB color coding to make it run from red to green, choosing the color by percentage: for example, a score between 0% and 20% is shown in red, and so on. This way, no matter what scale is used, the color coding is always consistent.&lt;br /&gt;
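The percentage-to-color mapping described above can be sketched in plain Ruby. This is a minimal sketch, not the actual Expertiza view code; the helper name `score_color` is hypothetical.

```ruby
# Hypothetical helper (illustrative, not the actual Expertiza code):
# map a score percentage (0-100) to a red-to-green hex color, so the
# coding stays consistent no matter what rubric scale produced the score.
def score_color(percentage)
  pct = percentage.clamp(0, 100)         # tolerate out-of-range inputs
  red   = ((100 - pct) * 255 / 100.0).round
  green = (pct * 255 / 100.0).round
  format("#%02x%02x00", red, green)
end

score_color(10)   # low scores come out red
score_color(95)   # high scores come out green
```

Because the mapping is driven by a percentage rather than raw points, any rubric scale can be normalized first and the colors stay comparable across assignments.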
&lt;br /&gt;
'''Issue 3:''' Adjacent bars represent the responses in rounds 1 through k. This makes sense only if the rubrics in all review rounds are the same; if the instructor uses the vary-rubric-by-round mechanism, this visualization will not make sense.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 3:''' The review responses are shown in a single grid covering all reviews from round 1 to round n. This is fine when the rubric is the same for every round, but when the rubric varies from round to round the view does not make sense. We therefore plan to create a separate grid for each review round, which works whether the rubrics differ or are the same.&lt;br /&gt;
&lt;br /&gt;
'''Issue 4:''' The table is presorted by team, but it should also be sortable alphabetically. The cell view is far too long and should be divided into partials.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 4:''' We can either sort the select query alphabetically by the appropriate column, or sort the table dynamically according to the user's criteria. The overly long view can be addressed with paging.&lt;br /&gt;
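The sort-then-page idea can be sketched in Ruby with hypothetical data (the row shape and `sorted_page` helper are illustrative assumptions, not Expertiza's actual query code):

```ruby
# Illustrative data: one hash per table row.
rows = [
  { team: "zeta",  score: 80 },
  { team: "alpha", score: 95 },
  { team: "mid",   score: 70 }
]

# Sort rows alphabetically by a chosen column, then split them into
# pages of fixed size so the rendered view stays short.
def sorted_page(rows, column, page, per_page)
  pages = rows.sort_by { |r| r[column].to_s.downcase }
              .each_slice(per_page).to_a
  pages[page] || []   # an out-of-range page yields an empty list
end

sorted_page(rows, :team, 0, 2)  # first page, sorted by team
```

In the real application the same effect would more likely come from an `ORDER BY` clause plus a pagination helper, but the logic is the same.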
&lt;br /&gt;
'''Issue 5:''' An interactive visualization or table that shows how a class performed on selected rubric criteria would be immensely helpful. It would show me what I need to focus more attention on.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 5:''' The goal is to grade the performance of the entire class against the five rubric criteria. A proposed solution is to reuse the logic below from assessment360_controller.rb, which calculates each student's average grade across all assignments in a course.&lt;br /&gt;
[[File:scr2.PNG]]&amp;lt;br&amp;gt;&lt;br /&gt;
The average grades of each student can then be pulled and used to plot a bar graph as shown below (created using dummy data):&lt;br /&gt;
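The averaging-and-plotting step can be sketched in plain Ruby with dummy data (variable names and data shapes here are illustrative, not the controller's actual structures):

```ruby
# Dummy data: each student's grades on all assignments in the course.
grades = {
  "alice" => [85, 90, 78],
  "bob"   => [70, 65, 81]
}

# Average grade per student across all assignments, mirroring the
# per-course averaging that assessment360_controller.rb performs.
averages = grades.transform_values { |scores| scores.sum / scores.size.to_f }

# Bucket the averages into 20-point bins (0, 20, 40, ...) -- this is
# the data series a class-performance bar graph would be drawn from.
histogram = Hash.new(0)
averages.each_value { |avg| histogram[(avg / 20).floor * 20] += 1 }
```

Each bin count then becomes one bar's height in the class-performance chart.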
[[File:Dummydata.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 1 :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Assignment1.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 2 :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Assignment2.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 3 :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Assignment3.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 4 :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Assignment4.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 5 :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Assignment5.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Graph of average class performance:&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Average.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
The workflow may look approximately like the diagram below:&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Workflow.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
== Test Plan: ==&lt;br /&gt;
'''Issue 1:''' Check that each link matches its hover-over text.&lt;br /&gt;
&lt;br /&gt;
'''Issue 2:''' Check the color coding when different numbers are given on different scales.&lt;br /&gt;
&lt;br /&gt;
'''Issue 3:''' Enter multiple reviews with different rubrics and check that a separate table/grid is shown for each review round.&lt;br /&gt;
&lt;br /&gt;
'''Issue 4:''' Check that the table is sorted alphabetically by the appropriate column.&lt;/div&gt;</summary>
		<author><name>Proy4</name></author>
	</entry>
	<entry>
		<id>https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2017/E1792_OSS_Visualizations_for_instructors&amp;diff=112220</id>
		<title>CSC/ECE 517 Fall 2017/E1792 OSS Visualizations for instructors</title>
		<link rel="alternate" type="text/html" href="https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2017/E1792_OSS_Visualizations_for_instructors&amp;diff=112220"/>
		<updated>2017-11-07T04:47:46Z</updated>

		<summary type="html">&lt;p&gt;Proy4: /* Issue Statement &amp;amp; Approach followed: */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;= Visualizations for instructors =&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
== '''Issue Statement &amp;amp; Approach followed:''' ==&lt;br /&gt;
'''Issue 1:''' Each cell contains a link, but it is not clear what will happen when the link is clicked or which page will open.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 1:''' Each cell links to a different page in Expertiza, but there is no way to tell which cell leads to which page, so the user does not know the destination before clicking. Since every cell contains a link, we will show hover-over text displaying the URL of the destination page.&lt;br /&gt;
&lt;br /&gt;
'''Issue 2:''' The scale runs from blue to green, which does not make sense, and the colors change randomly each time the page loads. It would be better to scale from red to green.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 2:''' The current color coding runs from blue to green and also changes randomly. We will use RGB color coding to make it run from red to green, choosing the color by percentage: for example, a score between 0% and 20% is shown in red, and so on. This way, no matter what scale is used, the color coding is always consistent.&lt;br /&gt;
&lt;br /&gt;
'''Issue 3:''' Adjacent bars represent the responses in rounds 1 through k. This makes sense only if the rubrics in all review rounds are the same; if the instructor uses the vary-rubric-by-round mechanism, this visualization will not make sense.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 3:''' The review responses are shown in a single grid covering all reviews from round 1 to round n. This is fine when the rubric is the same for every round, but when the rubric varies from round to round the view does not make sense. We therefore plan to create a separate grid for each review round, which works whether the rubrics differ or are the same.&lt;br /&gt;
&lt;br /&gt;
'''Issue 4:''' The table is presorted by team, but it should also be sortable alphabetically. The cell view is far too long and should be divided into partials.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 4:''' We can either sort the select query alphabetically by the appropriate column, or sort the table dynamically according to the user's criteria. The overly long view can be addressed with paging.&lt;br /&gt;
&lt;br /&gt;
'''Issue 5:''' An interactive visualization or table that shows how a class performed on selected rubric criteria would be immensely helpful. It would show me what I need to focus more attention on.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 5:''' The goal is to grade the performance of the entire class against the five rubric criteria. A proposed solution is to reuse the logic below from assessment360_controller.rb, which calculates each student's average grade across all assignments in a course.&lt;br /&gt;
[[File:scr2.PNG]]&amp;lt;br&amp;gt;&lt;br /&gt;
The average grades of each student can then be pulled and used to plot a bar graph as shown below (created using dummy data):&lt;br /&gt;
[[File:Dummydata.PNG]]&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 1 :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Assignment1.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 2 :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Assignment2.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 3 :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Assignment3.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 4 :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Assignment4.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 5 :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Assignment5.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
Graph of average class performance:&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Average.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
The workflow may look approximately like the diagram below:&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Workflow.PNG]]&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
== Test Plan: ==&lt;br /&gt;
'''Issue 1:''' Check that each link matches its hover-over text.&lt;br /&gt;
&lt;br /&gt;
'''Issue 2:''' Check the color coding when different numbers are given on different scales.&lt;br /&gt;
&lt;br /&gt;
'''Issue 3:''' Enter multiple reviews with different rubrics and check that a separate table/grid is shown for each review round.&lt;br /&gt;
&lt;br /&gt;
'''Issue 4:''' Check that the table is sorted alphabetically by the appropriate column.&lt;/div&gt;</summary>
		<author><name>Proy4</name></author>
	</entry>
	<entry>
		<id>https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2017/E1792_OSS_Visualizations_for_instructors&amp;diff=112219</id>
		<title>CSC/ECE 517 Fall 2017/E1792 OSS Visualizations for instructors</title>
		<link rel="alternate" type="text/html" href="https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2017/E1792_OSS_Visualizations_for_instructors&amp;diff=112219"/>
		<updated>2017-11-07T04:47:09Z</updated>

		<summary type="html">&lt;p&gt;Proy4: /* Issue Statement &amp;amp; Approach followed: */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;= Visualizations for instructors =&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
== '''Issue Statement &amp;amp; Approach followed:''' ==&lt;br /&gt;
'''Issue 1:''' Each cell contains a link, but it is not clear what will happen when the link is clicked or which page will open.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 1:''' Each cell links to a different page in Expertiza, but there is no way to tell which cell leads to which page, so the user does not know the destination before clicking. Since every cell contains a link, we will show hover-over text displaying the URL of the destination page.&lt;br /&gt;
&lt;br /&gt;
'''Issue 2:''' The scale runs from blue to green, which does not make sense, and the colors change randomly each time the page loads. It would be better to scale from red to green.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 2:''' The current color coding runs from blue to green and also changes randomly. We will use RGB color coding to make it run from red to green, choosing the color by percentage: for example, a score between 0% and 20% is shown in red, and so on. This way, no matter what scale is used, the color coding is always consistent.&lt;br /&gt;
&lt;br /&gt;
'''Issue 3:''' Adjacent bars represent the responses in rounds 1 through k. This makes sense only if the rubrics in all review rounds are the same; if the instructor uses the vary-rubric-by-round mechanism, this visualization will not make sense.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 3:''' The review responses are shown in a single grid covering all reviews from round 1 to round n. This is fine when the rubric is the same for every round, but when the rubric varies from round to round the view does not make sense. We therefore plan to create a separate grid for each review round, which works whether the rubrics differ or are the same.&lt;br /&gt;
&lt;br /&gt;
'''Issue 4:''' The table is presorted by team, but it should also be sortable alphabetically. The cell view is far too long and should be divided into partials.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 4:''' We can either sort the select query alphabetically by the appropriate column, or sort the table dynamically according to the user's criteria. The overly long view can be addressed with paging.&lt;br /&gt;
&lt;br /&gt;
'''Issue 5:''' An interactive visualization or table that shows how a class performed on selected rubric criteria would be immensely helpful. It would show me what I need to focus more attention on.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 5:''' The goal is to grade the performance of the entire class against the five rubric criteria. A proposed solution is to reuse the logic below from assessment360_controller.rb, which calculates each student's average grade across all assignments in a course.&lt;br /&gt;
[[File:scr2.PNG]]&amp;lt;br&amp;gt;&lt;br /&gt;
The average grades of each student can then be pulled and used to plot a bar graph as shown below (created using dummy data):&lt;br /&gt;
[[File:Dummydata.PNG]]&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 1 :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Assignment1.PNG]]&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 2 :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Assignment2.PNG]]&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 3 :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Assignment3.PNG]]&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 4 :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Assignment4.PNG]]&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 5 :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Assignment5.PNG]]&amp;lt;br&amp;gt;&lt;br /&gt;
Graph of average class performance:&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Average.PNG]]&amp;lt;br&amp;gt;&lt;br /&gt;
The workflow may look approximately like the diagram below:&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Workflow.PNG]]&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
== Test Plan: ==&lt;br /&gt;
'''Issue 1:''' Check that each link matches its hover-over text.&lt;br /&gt;
&lt;br /&gt;
'''Issue 2:''' Check the color coding when different numbers are given on different scales.&lt;br /&gt;
&lt;br /&gt;
'''Issue 3:''' Enter multiple reviews with different rubrics and check that a separate table/grid is shown for each review round.&lt;br /&gt;
&lt;br /&gt;
'''Issue 4:''' Check that the table is sorted alphabetically by the appropriate column.&lt;/div&gt;</summary>
		<author><name>Proy4</name></author>
	</entry>
	<entry>
		<id>https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2017/E1792_OSS_Visualizations_for_instructors&amp;diff=112218</id>
		<title>CSC/ECE 517 Fall 2017/E1792 OSS Visualizations for instructors</title>
		<link rel="alternate" type="text/html" href="https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2017/E1792_OSS_Visualizations_for_instructors&amp;diff=112218"/>
		<updated>2017-11-07T04:46:38Z</updated>

		<summary type="html">&lt;p&gt;Proy4: /* Issue Statement &amp;amp; Approach followed: */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;= Visualizations for instructors =&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
== '''Issue Statement &amp;amp; Approach followed:''' ==&lt;br /&gt;
'''Issue 1:''' Each cell contains a link, but it is not clear what will happen when the link is clicked or which page will open.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 1:''' Each cell links to a different page in Expertiza, but there is no way to tell which cell leads to which page, so the user does not know the destination before clicking. Since every cell contains a link, we will show hover-over text displaying the URL of the destination page.&lt;br /&gt;
&lt;br /&gt;
'''Issue 2:''' The scale runs from blue to green, which does not make sense, and the colors change randomly each time the page loads. It would be better to scale from red to green.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 2:''' The current color coding runs from blue to green and also changes randomly. We will use RGB color coding to make it run from red to green, choosing the color by percentage: for example, a score between 0% and 20% is shown in red, and so on. This way, no matter what scale is used, the color coding is always consistent.&lt;br /&gt;
&lt;br /&gt;
'''Issue 3:''' Adjacent bars represent the responses in rounds 1 through k. This makes sense only if the rubrics in all review rounds are the same; if the instructor uses the vary-rubric-by-round mechanism, this visualization will not make sense.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 3:''' The review responses are shown in a single grid covering all reviews from round 1 to round n. This is fine when the rubric is the same for every round, but when the rubric varies from round to round the view does not make sense. We therefore plan to create a separate grid for each review round, which works whether the rubrics differ or are the same.&lt;br /&gt;
&lt;br /&gt;
'''Issue 4:''' The table is presorted by team, but it should also be sortable alphabetically. The cell view is far too long and should be divided into partials.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 4:''' We can either sort the select query alphabetically by the appropriate column, or sort the table dynamically according to the user's criteria. The overly long view can be addressed with paging.&lt;br /&gt;
&lt;br /&gt;
'''Issue 5:''' An interactive visualization or table that shows how a class performed on selected rubric criteria would be immensely helpful. It would show me what I need to focus more attention on.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 5:''' The goal is to grade the performance of the entire class against the five rubric criteria. A proposed solution is to reuse the logic below from assessment360_controller.rb, which calculates each student's average grade across all assignments in a course.&lt;br /&gt;
[[File:scr2.PNG]]&amp;lt;br&amp;gt;&lt;br /&gt;
The average grades of each student can then be pulled and used to plot a bar graph as shown below (created using dummy data):&lt;br /&gt;
[[File:Dummydata.PNG]]&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 1 :-&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Assignment1.PNG]]&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 2 :-&lt;br /&gt;
[[File:Assignment2.PNG]]&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 3 :-&lt;br /&gt;
[[File:Assignment3.PNG]]&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 4 :-&lt;br /&gt;
[[File:Assignment4.PNG]]&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 5 :-&lt;br /&gt;
[[File:Assignment5.PNG]]&amp;lt;br&amp;gt;&lt;br /&gt;
Graph of average class performance:&lt;br /&gt;
[[File:Average.PNG]]&amp;lt;br&amp;gt;&lt;br /&gt;
The workflow may look approximately like the diagram below:&lt;br /&gt;
[[File:Workflow.PNG]]&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
== Test Plan: ==&lt;br /&gt;
'''Issue 1:''' Check that each link matches its hover-over text.&lt;br /&gt;
&lt;br /&gt;
'''Issue 2:''' Check the color coding when different numbers are given on different scales.&lt;br /&gt;
&lt;br /&gt;
'''Issue 3:''' Enter multiple reviews with different rubrics and check that a separate table/grid is shown for each review round.&lt;br /&gt;
&lt;br /&gt;
'''Issue 4:''' Check that the table is sorted alphabetically by the appropriate column.&lt;/div&gt;</summary>
		<author><name>Proy4</name></author>
	</entry>
	<entry>
		<id>https://wiki.expertiza.ncsu.edu/index.php?title=File:Assignment1.PNG&amp;diff=112217</id>
		<title>File:Assignment1.PNG</title>
		<link rel="alternate" type="text/html" href="https://wiki.expertiza.ncsu.edu/index.php?title=File:Assignment1.PNG&amp;diff=112217"/>
		<updated>2017-11-07T04:46:10Z</updated>

		<summary type="html">&lt;p&gt;Proy4: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Proy4</name></author>
	</entry>
	<entry>
		<id>https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2017/E1792_OSS_Visualizations_for_instructors&amp;diff=112216</id>
		<title>CSC/ECE 517 Fall 2017/E1792 OSS Visualizations for instructors</title>
		<link rel="alternate" type="text/html" href="https://wiki.expertiza.ncsu.edu/index.php?title=CSC/ECE_517_Fall_2017/E1792_OSS_Visualizations_for_instructors&amp;diff=112216"/>
		<updated>2017-11-07T04:45:51Z</updated>

		<summary type="html">&lt;p&gt;Proy4: /* Issue Statement &amp;amp; Approach followed: */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;= Visualizations for instructors =&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
== '''Issue Statement &amp;amp; Approach followed:''' ==&lt;br /&gt;
'''Issue 1:''' Each cell contains a link, but it is not clear what will happen when the link is clicked or which page will open.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 1:''' Each cell links to a different page in Expertiza, but there is no way to tell which cell leads to which page, so the user does not know the destination before clicking. Since every cell contains a link, we will show hover-over text displaying the URL of the destination page.&lt;br /&gt;
&lt;br /&gt;
'''Issue 2:''' The scale runs from blue to green, which does not make sense, and the colors change randomly each time the page loads. It would be better to scale from red to green.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 2:''' The current color coding runs from blue to green and also changes randomly. We will use RGB color coding to make it run from red to green, choosing the color by percentage: for example, a score between 0% and 20% is shown in red, and so on. This way, no matter what scale is used, the color coding is always consistent.&lt;br /&gt;
&lt;br /&gt;
'''Issue 3:''' Adjacent bars represent the responses in rounds 1 through k. This makes sense only if the rubrics in all review rounds are the same; if the instructor uses the vary-rubric-by-round mechanism, this visualization will not make sense.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 3:''' The review responses are shown in a single grid covering all reviews from round 1 to round n. This is fine when the rubric is the same for every round, but when the rubric varies from round to round the view does not make sense. We therefore plan to create a separate grid for each review round, which works whether the rubrics differ or are the same.&lt;br /&gt;
&lt;br /&gt;
'''Issue 4:''' The table is presorted by team, but it should also be sortable alphabetically. The cell view is far too long and should be divided into partials.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 4:''' We can either sort the select query alphabetically by the appropriate column, or sort the table dynamically according to the user's criteria. The overly long view can be addressed with paging.&lt;br /&gt;
&lt;br /&gt;
'''Issue 5:''' An interactive visualization or table that shows how a class performed on selected rubric criteria would be immensely helpful. It would show me what I need to focus more attention on.&lt;br /&gt;
&lt;br /&gt;
'''Approach to solve issue 5:''' The goal is to grade the performance of the entire class against the five rubric criteria. A proposed solution is to reuse the logic below from assessment360_controller.rb, which calculates each student's average grade across all assignments in a course.&lt;br /&gt;
[[File:scr2.PNG]]&amp;lt;br&amp;gt;&lt;br /&gt;
The average grades of each student can then be pulled and used to plot a bar graph as shown below (created using dummy data):&lt;br /&gt;
[[File:Dummydata.PNG]]&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 1 :-&lt;br /&gt;
[[File:Assignment1.PNG]]&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 2 :-&lt;br /&gt;
[[File:Assignment2.PNG]]&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 3 :-&lt;br /&gt;
[[File:Assignment3.PNG]]&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 4 :-&lt;br /&gt;
[[File:Assignment4.PNG]]&amp;lt;br&amp;gt;&lt;br /&gt;
Graph for class performance on Assignment 5 :-&lt;br /&gt;
[[File:Assignment5.PNG]]&amp;lt;br&amp;gt;&lt;br /&gt;
Graph of average class performance:&lt;br /&gt;
[[File:Average.PNG]]&amp;lt;br&amp;gt;&lt;br /&gt;
The workflow may look approximately like the diagram below:&lt;br /&gt;
[[File:Workflow.PNG]]&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
== Test Plan: ==&lt;br /&gt;
'''Issue 1:''' Check that each link matches its hover-over text.&lt;br /&gt;
&lt;br /&gt;
'''Issue 2:''' Check the color coding when different numbers are given on different scales.&lt;br /&gt;
&lt;br /&gt;
'''Issue 3:''' Enter multiple reviews with different rubrics and check that a separate table/grid is shown for each review round.&lt;br /&gt;
&lt;br /&gt;
'''Issue 4:''' Check that the table is sorted alphabetically by the appropriate column.&lt;/div&gt;</summary>
		<author><name>Proy4</name></author>
	</entry>
</feed>