CSC/ECE 517 Fall 2016 E1709 Visualizations for instructors
Introduction
Expertiza<ref>https://expertiza.ncsu.edu/</ref> is an open-source web application for creating reusable learning objects through peer review, facilitating incremental learning. Students can submit learning objects such as articles, wiki pages, and repository links, and improve them with the help of peer reviews. The project is developed using the Ruby on Rails<ref>https://en.wikipedia.org/wiki/Ruby_on_Rails</ref> framework and is supported by the National Science Foundation.
Project Description
Purpose and Scope
Expertiza assignments are based on a peer-review system: the instructor creates rubrics for an assignment through questionnaires, which students use to review other students' submissions. The author of a submission is given an opportunity to provide feedback about these reviews. Every question in a questionnaire has an assigned grade based on a pre-defined grading scale. Instructors can see a report of the scores a student received from reviewers, a report of the feedback scores given to the reviewers, and many other reports. Instructors grade the student based on these reports, but because the reports live on separate pages, an instructor has to navigate through many pages before grading a student. The first requirement addresses this problem by merging the review scores and the author feedback onto the same page.
Furthermore, there is no way for an instructor to evaluate the rubric he/she has posted for an assignment. Students may consistently receive negative reviews on a rubric criterion that is not really relevant to the assignment, thereby reducing their scores. A report of average class scores for each rubric criterion in the questionnaires would help instructors refine their rubrics and understand where students generally perform well and where they struggle. This new report forms the second part of the requirement.
We are not modifying any existing functionality of Expertiza. Our work involves modifying the review report and creating a new report for average class scores.
Task Description
The project requires completion of the following tasks:
- Integrate review data with author feedback data to help instructors grade the reviewers.
- Create a new table for review and author feedback report.
- The new table should contain the following information: reviewer name, number of reviews completed, length of reviews, review summary, whether a link or file is attached, average author feedback rating per team, author feedback summary, and a field where an instructor can assign a grade and write comments.
- Add interactive visualization to show the class performance of an assignment to an instructor.
- Create a new route/view/controller for class performance.
- Add a new link to point to the new controller created. This new link will be created per assignment.
- Create two new views: one for selecting rubric criteria and one to show the graph.
- Create graphs to show the class performance as per the rubric metrics selected dynamically.
Project Design
Design Patterns
Iterator Pattern<ref>https://en.wikipedia.org/wiki/Iterator_pattern</ref>: The iterator design pattern uses an iterator to traverse a container and access its elements. When implementing the review and author feedback report, we iterate through each reviewer to get the reviews they performed, and then through each author feedback given for those reviews. This iteration takes place over the data returned by the ResponseMap models for all of the review and feedback information. For the class performance report, we iterate through each questionnaire of an assignment and then through each question in each questionnaire. The same iteration is also required to get the answers per question per reviewer.
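As a rough illustration, the reviewer-side iteration could look like the sketch below. The model, association, and column names (for example, looking responses up by map_id) are assumptions for illustration rather than the exact Expertiza schema.

<pre>
# Sketch of the iteration for the review and author feedback report.
# Model, association, and column names are illustrative assumptions.
ReviewResponseMap.where(reviewed_object_id: assignment.id)
                 .group_by(&:reviewer_id)
                 .each do |reviewer_id, maps|
  maps.each do |map|
    Response.where(map_id: map.id).each do |review|
      # accumulate per-reviewer metrics: review count, comment length,
      # whether a file or link was attached, etc.
    end
  end
end
</pre>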
MVC Pattern<ref>https://www.tutorialspoint.com/design_pattern/mvc_pattern.htm</ref>: In the MVC design pattern, a controller processes the request, interprets the data in the model, and then renders a particular view. To render the review and author feedback report, we check the data submitted from the UI in ReviewMappingController. Depending on that data, we query the relevant models and then display a particular view.
Review and Author Feedback Report
Workflow
Final Design
- When the instructor selects the review report option, the screen shown below is displayed.
Implementation
This report is an enhancement of the review report.
- A new metric filter is provided so that the instructor can select metrics such as the average author feedback score and the average comment length, and can check for and open any file attached by the reviewer.
- For the above implementation we have modified the following files:
- app/views/review_mapping/_review_report.html.erb
- app/views/review_mapping/_searchbox.html.erb
- app/controllers/review_mapping_controller.rb
- app/helpers/review_mapping_helper.rb
- The instructor can now also sort reviewers by name.
- The data is taken from the ReviewResponseMap, FeedbackResponseMap, AssignmentParticipant, and Response models.
- From ReviewResponseMap and AssignmentParticipant, for each reviewer we get the number of reviews completed, the length of the reviews, a summary of the reviews, and whether the reviewer added a file or link with their review.
- From FeedbackResponseMap and Response, we get the author feedbacks given to a reviewer, and from these we compute the average feedback score for that reviewer across all feedbacks. For this we created a new helper method, "get_author_feedback_score_hash", which returns a hash keyed by the composite of reviewer id and round number, with the average score for each round as the value (a sketch appears after this list).
- All the above data is rendered in "app/views/review_mapping/_review_report.html.erb".
- Hyperlinks are provided where necessary so that the instructor can view additional details, e.g. the review summary or the author feedback summary.
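A condensed sketch of how such a helper could be structured is shown below. The traversal from a review to its author feedback and the feedback_score helper are hypothetical assumptions; the committed version is in the pull request linked below.

<pre>
# Sketch of get_author_feedback_score_hash (review_mapping_helper.rb).
# The associations used to walk from a review to its author feedback and
# the feedback_score helper are hypothetical; see the pull request for
# the committed implementation.
def get_author_feedback_score_hash(assignment)
  raw_scores = Hash.new { |h, k| h[k] = [] }

  ReviewResponseMap.where(reviewed_object_id: assignment.id).each do |review_map|
    Response.where(map_id: review_map.id).each do |review|
      feedback_map = FeedbackResponseMap.find_by(reviewed_object_id: review.id)
      next if feedback_map.nil?
      Response.where(map_id: feedback_map.id).each do |feedback|
        key = [review_map.reviewer_id, review.round]   # composite key: reviewer id + round
        raw_scores[key] << feedback_score(feedback)    # hypothetical scoring helper
      end
    end
  end

  # average the collected feedback scores per (reviewer, round) pair
  raw_scores.each_with_object({}) do |(key, values), averages|
    averages[key] = values.inject(0.0, :+) / values.size
  end
end
</pre>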
Pull Request and Demo
The demo for the review report can be seen here: https://www.youtube.com/watch?v=mRgT9vdIizk
The pull request is here: https://github.com/expertiza/expertiza/pull/870.
Class Performance Report
Workflow
Final Design
- The instructor can view the class performance on assignments by clicking on the graph icon on the assignments page as shown below.
- Clicking the graph icon takes the instructor to the page shown below, where the instructor can select the rubric questions used to evaluate that assignment.
- After the rubric questions are selected, the instructor is taken to the page shown below, where the performance of the class on the selected rubric questions is displayed.
Implementation
In order to implement the above functionality for the class performance report, we implemented the following:
- We added a new controller ClassPerformanceController.
- We added the following two new views:
- A select_rubrics view, which allows instructors to select the rubric criteria on which to evaluate class performance.
- A show_class_performance view, which displays the class performance using graphs that present the information clearly.
- Routes for each of the views created (a sketch appears below this list).
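The new routes could look roughly like the sketch below; the path and action names are assumptions based on the view names above, and the pull request contains the exact routes.

<pre>
# config/routes.rb (sketch; path and action names are illustrative)
resources :class_performance, only: [] do
  collection do
    get :select_rubrics          # rubric selection page
    get :show_class_performance  # graph page
  end
end
</pre>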
We need to provide a link for the instructor to reach this view. As shown above, this is a button on the assignment management page, routed at tree_display/list, which corresponds to the list action in TreeDisplayController. The button we added there routes to the newly created select_rubrics view.
The select_rubrics view receives the following parameter:
- assignment_id
The controller gets all the questionnaires related to that assignment from the AssignmentQuestionnaire model. It then gets a list of all the rubric criteria used in those questionnaires from the Question model. These are displayed to the instructor, and upon selection of rubrics the instructor is routed to the show_class_performance view, which is passed the following parameters:
- assignment_id
- the array of selected question ids (Array[question_id])
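A rough sketch of the rubric-selection action under these assumptions (conventional AssignmentQuestionnaire and Question attributes) is shown below; the committed action may differ.

<pre>
# Sketch of the select_rubrics action (ClassPerformanceController).
# Column names are assumptions for illustration.
def select_rubrics
  @assignment_id = params[:assignment_id]
  questionnaire_ids = AssignmentQuestionnaire.where(assignment_id: @assignment_id)
                                              .pluck(:questionnaire_id)
  # all rubric criteria (questions) used by this assignment's questionnaires
  @questions = Question.where(questionnaire_id: questionnaire_ids)
end
</pre>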
The controller takes the list of questions for the assignment and finds all the answers to those questions. It then calculates the average score per question for the entire class from those answers and passes the result to the view. The show_class_performance view uses the Gruff API to provide an aesthetically appealing visualization of the data.
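Putting the averaging and the Gruff rendering together, this step might look like the sketch below. The Answer model's column names and the output path are assumptions for illustration; Gruff::Bar is shown here as one possible chart type.

<pre>
require 'gruff'

# Sketch of the averaging and charting step; Answer column names
# (question_id, answer) and the output path are illustrative assumptions.
def show_class_performance
  question_ids = params[:question_ids] || []

  # average score per selected rubric question across the whole class
  averages = question_ids.map do |qid|
    scores = Answer.where(question_id: qid).pluck(:answer).compact
    scores.empty? ? 0 : scores.inject(0.0, :+) / scores.size
  end

  g = Gruff::Bar.new(800)
  g.title = 'Class performance per rubric criterion'
  g.labels = Hash[question_ids.each_with_index.map { |qid, i| [i, "Q#{qid}"] }]
  g.data('Average score', averages)
  g.write('public/images/class_performance.png')
end
</pre>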
Pull Request and Demo
The demo for the class performance report can be seen here: https://youtu.be/_fp4YYCgY2w
The pull request is here: https://github.com/expertiza/expertiza/pull/861
Use Cases
- View Review and Author Feedback Report as Instructor: The instructor can see the different review metrics and the average feedback rating received per student for an assignment or project.
- View Class Performance as Instructor: The instructor can select 5 of the rubric metrics used in an assignment and view a graph of the class performance based on the selected metrics.
Test Plan
- For use case 1, test whether the instructor can see the text metrics of reviews and the author feedback received per student for an assignment or project.
- For use case 2, test whether the instructor can select any number of the rubric metrics used for an assignment. Also, test whether the instructor can view the class performance in a graph based on the metrics they selected.
Requirements
While the software and hardware requirements are the same as for the current Expertiza system, we require the following additional tool:
- Tools: the Gruff graphing library, in addition to the current Expertiza system.
References
<references/>