CSC/ECE 517 Fall 2015 E1577 MayYellowRoverJump
Revision as of 16:16, 9 December 2015
Introduction
Expertiza is an open-source education and classroom web tool funded by the National Science Foundation. Built with Ruby on Rails, it is designed to manage complete courses and students’ work within those courses. Each course can have a collection of instructors and students, though the interaction between instructors and students is minimal. The real emphasis of Expertiza is on peer-to-peer interactions, fostering a student-driven learning environment. Courses are composed of assignments, which users complete individually or with a team. Assignments usually encourage or require a team in order to enforce practicing peer-to-peer interaction.
One of the main tenets of Expertiza is its implicit peer-review system. Assignments inherently have a review stage where, rather than having instructors review a team’s work, other students review a team’s submission for that assignment. When completing a review, a student is presented with essentially a rubric for the assignment, and they fill in each category with the score they deem commensurate with the work of the team. Of course, each category has a comments box for the student to qualify the score they doled out. Each member of the submitting team is notified of the review, and the team can then decide as a whole how to rework their submission based on the feedback in the peer reviews.
There do exist issues, however, with respect to viewing one’s team’s reviews, particularly in the realm of usability. Our team has been tasked with revamping and enhancing the review UI to produce a more focused and uniform user experience for students and instructors alike.
Assignment
Description
The tasks of the assignment are as follows:
- Compact the review display. Eliminate the blank lines between items within a single review. Instead vary the background color from line to line to improve readability.
- Add the following to the top of each review: who submitted the review, the version number of the review, and the time the review was submitted. The instructor should see the reviewer’s name and user ID; a student should see “Reviewer #k”, where k is an integer between 1 and n, the number of reviews that have been submitted for this project.
- Add functionality to allow the instructor to view different review rounds. Also, provide instructor with a review report page of all reviewers’ reviews to every project.
- Allow an alternate view: group by question.
- Reduce the duplicate code between instructor and student grade reports.
Purpose
Motivations
- Lack of uniformity between student and instructor views
- No defined separation between reviews
- All reviews and review data (comments, question text, etc.) are displayed at once
Discussion
There is no denying that the usability of viewing peer reviews leaves much to be desired. It lacks uniformity across the student and instructor roles, and the view itself has little order or organization. Viewing a single student’s review is a chore for both instructors and students, as there is no clear separation between reviews. In addition, all reviews are displayed at once, so viewing a single review requires scrolling through the page until the desired review is found. Our goal is to take the same data in the current display and present it in a more focused manner that allows a user, in either the instructor or student role, to absorb and process the content of the peer review more efficiently. Accessing, viewing, and understanding a review should be a far simpler task than it currently is. In addition to the overhaul of the presentation layer, we also strive to drastically increase code reuse in the controller and model layers of the review module, which will in turn create a more uniform experience for both the instructor and student roles.
Scope
The scope of this task is limited to enhancing the usability of viewing peer reviews for both students and instructors. It is within our scope to modify the corresponding views for this functionality, as well as the underlying controllers and models as needed. The modifications to the Ruby classes will either accommodate changes to the view or provide a uniform experience for both the instructor and student. As this is primarily a user-experience task, i.e., changing the way data is displayed to the user, there will be limited modifications to the base functionality of this module. It is not within our scope to change any element of the actual peer review process, such as selecting a topic to review or completing a review of a topic. As a result, we will not be modifying the results of peer reviews; the same peer review data will be present both before and after our task is completed.
Design
Discussion of Resolution
The goal of this project is to optimize the code and UI of the review module, making it more readable and user-friendly. Specifically, our work focuses on the following areas:
- Refactoring grades_controller.rb and review_mapping_controller.rb to improve code organization, making the code easier to read.
- Modifying the UI to be more friendly. The instructor can see users' names and user IDs, while a student sees review numbers like “Reviewer #k”, where k is an integer between 1 and the number of reviews submitted for the project/assignment. In addition, the round of each review (version number) and the time it was submitted can be seen by both students and the instructor.
- Modifying the UI to make it easier for students and instructors to see reviews from different rounds. Tabs are a good choice; drop-down menus are a good alternative. We may also need to modify the models to accommodate different rounds of reviews. (Currently, the review models only record the latest version of reviews.)
- Providing a new page that displays all reviews of one project/assignment as a review report. On this page, reviews will be displayed in a format like “Question 1: Review 1, Review 2 … Review n” (the reviewer’s name should also be included). We also need to provide the different versions/rounds of reviews for each question. At the top of this page, there should be a matrix summarizing questions (as rows) and reviews (as columns). (How can different versions be displayed in the matrix? Would separate matrices per round work?)
- Providing a way to hide or gray out the questions, so students and instructors can focus more on the reviews.
- Providing keyword search over reviews, with a ‘next’ button to navigate to the next occurrence of the keyword.
- Providing a two-dimensional table that shows the score for each question along with the name of the reviewer who gave it.
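The question-by-reviewer matrix described in the last two bullets could be assembled along these lines. This is a minimal Ruby sketch; the input shape (question/reviewer/score triples) is an assumption for illustration, not the actual Expertiza schema.

```ruby
# Illustrative sketch: group flat review answers into a question-by-reviewer
# matrix. The hash-of-hashes shape maps each question to the score each
# reviewer gave it. Field names here are hypothetical.
def build_review_matrix(answers)
  matrix = Hash.new { |h, question| h[question] = {} }
  answers.each do |answer|
    matrix[answer[:question]][answer[:reviewer]] = answer[:score]
  end
  matrix
end
```

Rendering the matrix then reduces to iterating questions as rows and reviewers as columns.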
Mock-Ups
Image 1. The current review display for students.
Image 1, above, illustrates the current design of the Student Review Report. We propose entirely removing the summary statistics at the top of the page. The data in these statistics can be displayed in a more efficient manner.
Image 2. The proposed student review report, in default collapsed view.
The main UI overhaul will occur in displaying the actual reviews. Image 2 illustrates the design for the student review report. To combat the length of the current review display, we have chosen to collapse all data into a single table that can be viewed without endlessly scrolling through the page. Each row of the table corresponds to a specific criterion of the review and holds the scores each reviewer gave, along with whatever comments they may have offered. The columns indicate which review each score came from. By default, the full criterion text is truncated and the comments from a review are hidden. Because not all comments are visible on a single screen, the rightmost field displays the number of comment fields for each criterion which have 10 or more characters. The purpose of this field is to give the user a quick and effective signal as to whether the criterion contains meaningful comments.
The colored background of each cell is based on a color scale relative to the score. These colors are added to the design in order to draw the user's attention quickly, allowing them to pick out the essential information without having to iterate through all the data.
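One possible score-to-color mapping for such a heat grid is sketched below. The thresholds and CSS class names are illustrative assumptions, not the actual Expertiza implementation.

```ruby
# Illustrative sketch: map a review score to a CSS color class for the
# heat grid, relative to the question's maximum score. Thresholds and
# class names are assumptions for demonstration only.
module ColorMapping
  def self.color_class(score, max_score)
    return 'c-none' if max_score.nil? || max_score.zero?
    ratio = score.to_f / max_score
    case ratio
    when 0.8..1.0  then 'c-green'  # strong score
    when 0.6...0.8 then 'c-yellow' # middling score
    when 0.4...0.6 then 'c-orange' # below average
    else                'c-red'    # weak score; should draw attention
    end
  end
end
```

The stylesheet would then define background colors for each class, so the view only needs to emit the class name per cell.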
Image 3. The proposed student review report, in expanded view.
Clicking the row number will display all comments for that particular criterion, i.e., comments from each review. This way a student can compare comments and scores for a single question across all reviews. Clicking the column header for a specific review will display all the comments left by that reviewer. A student can use this functionality to view the entirety of a single review, including scores and comments.
It is worth noting a key data-organization difference between the existing design and the proposed design: the existing review report groups responses by the reviewer who created them, while the proposed review report groups responses by criterion. This new grouping puts like with like, allowing users to see all scores and comments pertaining to one criterion at once.
Image 4. Current design of the Instructor's Review Report.
The instructor role will also receive interaction updates using the same underlying display structures, but the content differs slightly from that of the student. The student view will also be made available to the instructor via a link on the 'grades/view' page, allowing the instructor to see this updated score report for a requested participant.
Design Patterns
The implementation team used the following design patterns in the solution:
- Strategy Design Pattern: Using the strategy design pattern, we can display two versions of the review page according to which role is currently logged in. For instance, in the student role’s review page, they will see a different layout than the instructor role's review page, but the actual content is more or less the same.
- Module Design Pattern: Using the module design pattern, source code can be organized into components that each accomplish a particular function. In the JavaScript files, we use anonymous functions for client-side responsiveness via jQuery. We also developed a view-model module to contain the information needed to render the review content on the page.
- Facade Design Pattern: The façade design pattern provides a unified, high-level interface to a set of interfaces in a subsystem, making the subsystem easier to use. In our implementation, the student role's review page gathers grades, comments, and feedback from three different controllers; they are accessed from within the review controller and displayed on one review page. The client need only interact with the façade, i.e., the review controller.
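As a rough sketch of the façade idea, the review page could talk to a single object that aggregates the three data sources. The class and method names here are hypothetical, not the actual Expertiza controllers.

```ruby
# Illustrative façade sketch: one entry point that gathers grades, review
# comments, and feedback from separate collaborators, so the view only
# interacts with a single object. All names are hypothetical.
class ReviewPageFacade
  def initialize(grades_source, comments_source, feedback_source)
    @grades_source   = grades_source
    @comments_source = comments_source
    @feedback_source = feedback_source
  end

  # One call returns everything the review page needs to render.
  def page_data(participant_id)
    {
      grades:   @grades_source.call(participant_id),
      comments: @comments_source.call(participant_id),
      feedback: @feedback_source.call(participant_id)
    }
  end
end
```

The design choice is that the view never needs to know which subsystem produced each piece of data; swapping a data source only touches the façade's constructor arguments.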
Use Cases
- View Received Scores and Reviews for Specified Assignment, Grouped by Criteria: Accessed from the student's landing page for a particular assignment. Allows a student to see scores for their own assignment, grouped by criteria, as well as meta reviews and author reviews.
Controller Testing
Controller testing will center on the proposed views, which will replace the "view" and "view_my_scores" views in the grades controller. The Functional Tests section covers much of the modular functionality that makes up the proposed views. It is not anticipated that the testing team will perform significant controller tests.
Functional Tests
- ColorMapping module: The generation of the heat grid requires mapping review scores to color groups. Testing this method's functionality requires various fixture collections which vary in size, range, max, min, and number of duplicates.
- Role Security Testing: In order to DRY the code and make it maximally orthogonal, the intention is to reuse code where possible. Reusing code across students and participants will require security and role checks to confirm the session user has access to view the requested information. Testing this functionality sufficiently will require testing as admin, instructor with access, instructor without access, student, non-participant, participant, team member, non-team-member, etc.
- Char Comment Count module: Generating the "# of comments with chars > x" result will require a module. This module will be tested with various collections of comments which vary in size, range, length, and content.
- TeamReview(assignment, team): This controller helper method will be called to generate the peer review scores for a particular team (note: no author or meta reviews). A collection of review scores and comments will be generated. This will be tested with empty, null, many-round, single-round, many-reviewer, and single-reviewer scores.
- ParticipantReview(participant): This controller helper method will be called to generate the review scores for a particular participant. Collections generated will include reviews, author reviews, and meta-reviews for a participant. This will be tested with empty, null, many-round, single-round, many-reviewer, and single-reviewer scores.
- RubricQuestion(questionnaire): This controller helper method will be called by many views in the solution. It returns a list of criteria for each round of a questionnaire or rubric. This will be tested with empty, null, many-round, and single-round rubrics.
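As an example of the fixture-driven testing described above, the char-comment-count check could look like the following plain-Ruby sketch. Both the module under test and the fixture shapes are assumptions for illustration.

```ruby
# Illustrative sketch of the "# of comments with chars > x" module.
# Module and method names are hypothetical.
module CharCommentCount
  # Count comments whose length exceeds the threshold; nil comments
  # are treated as empty strings.
  def self.count_over(comments, threshold = 10)
    comments.count { |c| c.to_s.length > threshold }
  end
end

# Fixture collections varying in size, length, and content:
EMPTY_COMMENTS = []
SHORT_COMMENTS = ['ok', 'fine']
MIXED_COMMENTS = ['ok', 'This comment is clearly long enough', nil]
```

Each fixture exercises one of the variations the test plan calls out (empty, all-short, mixed with nil entries).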
UI Tests
The UI will be tested manually, and also automatically with Selenium. Compliance and validation will be checked via an online tool.
Implementation
Three models were added to retrieve useful data: vm_question_response, vm_question_response_row, and vm_question_response_score_cell. They comprise a view-model module that centralizes the data and content rendered to the page. A vm_question_response has multiple vm_question_response_rows, each of which has multiple vm_question_response_score_cells.
The vm_question_response_score_cell is a simple data holder. It represents a single cell in the table view and holds the score it should display along with its appropriate color code, which is used by the stylesheet. The vm_question_response_row represents a single row in the table view. It holds the question text, the score cells for that row, and the number of comments with more than 10 characters; it is also responsible for calculating the average score across all reviews for that particular question. The vm_question_response can be thought of as the controller for the table: it holds all the rows and is responsible for calculating aggregate values, such as the total and average scores for each review.
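The row and score-cell portion of this hierarchy can be sketched roughly as follows. Class and attribute names are illustrative; the actual Expertiza view-model classes may differ in detail.

```ruby
# Illustrative sketch of the view-model hierarchy described above.
# Names and fields are assumptions, not the exact Expertiza code.
class VmQuestionResponseScoreCell
  attr_reader :score, :color_code, :comment

  def initialize(score, color_code, comment = '')
    @score = score          # score to display in this cell
    @color_code = color_code # CSS class consumed by the stylesheet
    @comment = comment
  end
end

class VmQuestionResponseRow
  attr_reader :question_text, :score_cells

  def initialize(question_text)
    @question_text = question_text
    @score_cells = []
  end

  def add_cell(cell)
    @score_cells << cell
  end

  # Average score across all reviews for this question.
  def average_score
    return 0 if @score_cells.empty?
    @score_cells.sum(&:score).to_f / @score_cells.size
  end

  # Number of comments with more than 10 characters.
  def countable_comments
    @score_cells.count { |c| c.comment.length > 10 }
  end
end
```

A top-level vm_question_response object would then hold a collection of these rows and compute per-review totals and averages by iterating over them.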
Two core actions were added in grades_controller: view_team and view_reviewer.
The action view_team replaces the old action view_my_score, which a student uses to view all reviews. This action first gathers the current user, the assignment being viewed, the user's team, and the rounds for the current assignment. It then iterates over the questionnaires (the reviews) and creates a new view model, which is responsible for rendering the content, populating data on the view model as needed.
The action view_reviewer can be used to view the details of all reviews. This action compiles the questions and their respective scores into a hash.
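The hash compilation might look roughly like the sketch below. The record shapes are assumptions for illustration; Expertiza's actual Answer and Question models differ in detail.

```ruby
# Illustrative sketch: compile questions and their respective scores into
# a hash keyed by question text. Field names are hypothetical.
def compile_question_scores(answers)
  answers.each_with_object(Hash.new { |h, k| h[k] = [] }) do |answer, hash|
    hash[answer[:question_text]] << answer[:score]
  end
end
```

The view can then render each question once, with all of its scores alongside it.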
Two major views were added: view_team.html.erb and view_reviewer.html.erb. In view_team.html.erb, HTML table tags combined with JavaScript are used to generate a table that displays all the scores from reviews, categorized by question. The stylesheet grades.scss is used to create the heat grid effect, allowing users to quickly hone in on bad scores.
By clicking the question row, the full question text and the scores from each review are displayed.
In review_reviewer.html.erb, detailed reviews are displayed.
Future Enhancements
- The project team may attempt to implement sorting of the lists and tables in both the instructor and student review pages.
- Open question: how to provide the functionality of the various action links in the instructor view, such as email student, delete score, change score, etc.
References
https://en.wikipedia.org/wiki/Code_refactoring
https://en.wikipedia.org/wiki/Flyweight_pattern
https://en.wikipedia.org/wiki/Strategy_pattern
https://en.wikipedia.org/wiki/Front_Controller_pattern
https://en.wikipedia.org/wiki/Module_pattern