CSC/ECE 517 Fall 2025 - E2562. Review grading dashboard
Background
The codebase for this reimplementation effort can be found in the following repositories:
The Review Grading Dashboard is a modernized instructor interface that simplifies grading peer reviews by consolidating key review information, grading tools, and performance metrics into a single, sortable page.
Problem Statement
The current Expertiza system makes grading peer reviews cumbersome and inefficient. Review data such as reviewer details, completion status, and scores is scattered across multiple pages, forcing instructors to navigate between several views just to assign grades.
The goal of this project is to design and implement a dedicated Review Grading Dashboard that provides instructors and TAs with a unified page to:
- View all reviews submitted by students.
- Check review completion status and assigned scores.
- Analyze review metrics such as “volume” (unique word usage).
- Enter grades and comments for reviews.
- Export review scores and grades to a CSV file.
This dashboard serves as the central grading hub for instructors, replacing the older fragmented grading workflow.
Requirements
Core Requirements
- The dashboard is accessible only to users with the Instructor or TA role.
- The dashboard displays a table containing the following columns:
- Reviewer: Username and full name of the reviewer.
- Reviews Done: Number of reviews completed out of the total assigned.
- Team Reviewed: Each reviewed team, color-coded by review completion status (e.g., done, pending, late).
- Scores: Weighted average review scores per round, calculated via the existing `aggregate_questionnaire_score` method.
- Metrics: A column chart comparing the number of unique words used by the reviewer to the average review volume.
- Grade: Text box for instructors to enter a numeric grade.
- Comments: Text box for instructors to write feedback or grading rationale.
- Columns such as Reviewer, Reviews Done, and Scores should be sortable.
- Table rows should have alternating background colors for readability.
- A button should allow instructors to export all grades and scores as a CSV file.
- If there are multiple review rounds, display one score and metric per round.
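The per-round scores in the Scores column come from the existing `aggregate_questionnaire_score` method. As a rough illustration of that kind of rubric aggregation (the `Answer` struct and method name below are assumptions for the sketch, not Expertiza's actual implementation):

```ruby
# Illustrative weighted average over rubric answers: each answer carries a
# score and the weight of the question it answers. This mirrors the idea
# behind aggregate_questionnaire_score, not its actual code.
Answer = Struct.new(:score, :weight)

def weighted_review_score(answers)
  total_weight = answers.sum(&:weight)
  return 0.0 if total_weight.zero?

  # Sum of (score * weight) divided by total weight.
  answers.sum { |a| a.score * a.weight } / total_weight.to_f
end
```

With one score per round computed this way, the dashboard can show a separate Scores value for each review round.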
Design Requirements
- Follow the same design guidelines as the reimplementation-front-end design document.
- Use compact column widths to prevent unnecessary wrapping.
- Implement hover tooltips to show extended reviewer or metric details.
- Use consistent UI elements (text boxes, buttons) from the existing frontend.
- For the "Metrics" column chart:
- Charting library: Recharts (React-based, lightweight, responsive).
- The chart should compare each student’s review “volume” with the average.
- Maintain color accessibility and responsiveness across devices.
Implementation
Frontend
- Implemented using React + TypeScript as part of the reimplementation front-end.
- Fetch review and grading data from new API endpoints.
- Use Recharts for the Metrics visualization.
- Build a sortable, paginated table with reusable table and form components.
Backend
- Implemented using Ruby on Rails.
- Create the following REST API endpoints:
- `/api/v1/review_dashboard_data` (GET) – Fetch all review-related data.
- `/api/v1/assign_review_grade` (POST) – Submit grades and comments.
- `/api/v1/export_reviews_csv` (GET) – Export data to CSV.
- Optionally add a `ResponseVolumeMixin` to the `Response` model to calculate “volume,” defined as the count of unique words in a student’s review.
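Since “volume” is defined as the count of unique words in a review, the mixin can be a small module. The sketch below assumes the including model exposes the review text via a `review_text` method; in the real `Response` model the text may instead live in associated answer records, so the accessor is an assumption:

```ruby
# Sketch of the proposed ResponseVolumeMixin: "volume" = number of
# unique, case-insensitive words in the review text.
module ResponseVolumeMixin
  def volume
    # Split on word characters, lowercase, and count distinct words.
    review_text.to_s.downcase.scan(/[a-z0-9']+/).uniq.size
  end
end

# Minimal stand-in for illustration; the real mixin would be included
# in the Rails Response model.
class FakeResponse
  include ResponseVolumeMixin

  def initialize(text)
    @text = text
  end

  def review_text
    @text
  end
end
```

The Metrics column chart would then compare each reviewer's `volume` against the class average.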
Data Flow
1. Frontend (React) sends `GET /review_dashboard_data`.
2. Backend (Rails) returns JSON with reviewer info, scores, and metrics.
3. Frontend renders the sortable table and charts.
4. Instructor input is submitted via `POST /assign_review_grade`.
5. Backend stores the grade and supports CSV export.
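The last step of the flow is the CSV export served by `/api/v1/export_reviews_csv`. A minimal sketch of how the backend could build that file using Ruby's standard `CSV` library (row keys and the two-round column layout are illustrative assumptions, not the actual schema):

```ruby
require "csv"

# Sketch of the CSV export behind the "Export CSV" button. Each row hash
# holds one reviewer's dashboard data; keys are assumed names.
def export_review_grades_csv(rows)
  CSV.generate do |csv|
    csv << ["Reviewer", "Reviews Done", "Score (Round 1)",
            "Score (Round 2)", "Grade", "Comments"]
    rows.each do |row|
      csv << [row[:reviewer], row[:reviews_done], row[:score_round1],
              row[:score_round2], row[:grade], row[:comments]]
    end
  end
end
```

In the Rails controller this string would be returned with `send_data` and a `text/csv` content type so the browser downloads it as a file.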
Results
(To be completed upon implementation)
Expected outcomes:
- A clean, sortable dashboard interface for instructors.
- Visual metrics to compare review quality and engagement.
- Streamlined grading workflow with direct input and CSV export.
Test Plan
Manual Testing
- Log in as an instructor or TA.
- Navigate to the course dashboard and click the Review Grading Dashboard link.
- Verify that:
- Reviewer names, teams reviewed, and completion status are displayed correctly.
- Team color codes correspond to review completion.
- Scores are accurate per round.
- The metrics chart renders correctly using Recharts.
- Grades and comments can be entered and saved successfully.
- Click the “Export CSV” button and ensure the CSV file contains accurate data.
- Test sorting functionality for multiple columns.
- Confirm that alternating row colors and hover highlights appear correctly.
Automated Testing
- Backend (RSpec):
- Test API responses for correctness and completeness.
- Validate that the `volume` method returns the expected unique word count.
- Verify CSV export structure and content.
- Frontend (Jest + React Testing Library):
- Test rendering of dashboard table and Recharts components.
- Verify sorting and input functionalities.
- Mock API calls to test grade submission.
Team Members
- Deekshith Anantha
- Srinidhi
- Abhishek Rao