CSC/ECE 517 Spring 2026 - E2609. Review calibration
About Expertiza
Expertiza is an open-source project based on the Ruby on Rails framework. It allows instructors to create and customize assignments, manage student teams, and handle peer review workflows. Expertiza supports diverse submission types including URLs and wiki pages, and provides a robust framework for student assessment.
Problem Statement
A “calibration assignment” is an assignment where students are asked to review work that has been previously reviewed by a member of the course staff. If the student’s review “resembles” the staff-member’s review, then the student is presumed to be a competent reviewer.
The goals for this part of the project were:
- Implement the backend logic to handle calibration participants and expert reviews.
- Add a mechanism to link a student (as a calibration participant) to an instructor's expert review.
- Ensure the system supports the "for_calibration" distinction in ResponseMaps.
- Provide JSON endpoints for listing participants and their submitted content to support the front-end Vue.js interface.
About Calibration Response Maps
Calibration Response Maps are a specialized type of feedback mapping. Unlike standard peer reviews where students review each other, a calibration map links an instructor (as the reviewer) to a specific piece of work (via a student participant) that serves as the "gold standard" for the assignment. Setting the for_calibration flag to true allows the system to distinguish these expert reviews for future comparison with student reviews.
Current Implementation
Previously, the reimplemented back end lacked a dedicated controller to manage the creation and retrieval of calibration-specific response maps. While the database supported response mapping, there was no streamlined way to:
- Idempotently add a calibration participant.
- Automatically generate the instructor-to-participant mapping with the correct calibration flags.
- Retrieve submitted content (links/files) specifically for calibration work.
New Implementation
Database Changes
A migration was implemented to add the for_calibration boolean flag to the response_maps table:
class AddForCalibrationToResponseMaps < ActiveRecord::Migration[8.0]
  def change
    add_column :response_maps, :for_calibration, :boolean, default: false, null: false
  end
end
Controller: CalibrationResponseMapsController
This new controller handles the logic for calibration management. Key methods include:
index
Lists all calibration response maps for an assignment where the current instructor is the reviewer.
- Logic: Filters ResponseMap records where reviewed_object_id is the assignment ID, reviewer_id is the instructor's participant ID, and for_calibration is true.
- Response: Returns JSON with nested reviewee and user data.
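The filtering described above can be sketched in plain Ruby, with a Struct standing in for the ActiveRecord model (attribute names follow the description; this is illustrative, not the actual controller code):

```ruby
# Plain-Ruby sketch of the index filter; Struct stands in for ResponseMap.
CalMap = Struct.new(:reviewed_object_id, :reviewer_id, :for_calibration, keyword_init: true)

# Select maps for one assignment where the instructor's participant is the
# reviewer and the calibration flag is set.
def calibration_maps(maps, assignment_id:, instructor_participant_id:)
  maps.select do |m|
    m.reviewed_object_id == assignment_id &&
      m.reviewer_id == instructor_participant_id &&
      m.for_calibration
  end
end

maps = [
  CalMap.new(reviewed_object_id: 1, reviewer_id: 10, for_calibration: true),
  CalMap.new(reviewed_object_id: 1, reviewer_id: 10, for_calibration: false),
  CalMap.new(reviewed_object_id: 2, reviewer_id: 10, for_calibration: true)
]

calibration_maps(maps, assignment_id: 1, instructor_participant_id: 10).length  # => 1
```

In the real controller the same three conditions would appear as a single `where` clause; the key point is that only maps flagged `for_calibration` for the given assignment and instructor are returned.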
create
Handles adding a new calibration participant and setting up the expert review map.
- Logic:
  - Validates the provided username.
  - Finds or creates an AssignmentParticipant for that user.
  - Ensures the instructor is also a participant in the assignment.
  - Creates/updates a ResponseMap with for_calibration: true.
- Response: Returns the participant, the response map, and a team_payload containing submitted hyperlinks.
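The idempotent create flow above can be sketched with an in-memory store in place of the database (the class and method names here are assumptions for illustration only):

```ruby
# In-memory sketch of the idempotent create flow; Hashes stand in for the DB.
class CalibrationStore
  def initialize
    @participants = {}  # username => participant id
    @maps = {}          # [assignment_id, reviewee_id] => map attributes
    @next_id = 0
  end

  # Find-or-create keeps the action idempotent: submitting the same
  # username twice returns the existing participant, not a duplicate.
  def find_or_create_participant(username)
    @participants[username] ||= (@next_id += 1)
  end

  # Create (or reuse) the expert-review map with the calibration flag set.
  def upsert_calibration_map(assignment_id:, reviewer_id:, reviewee_id:)
    @maps[[assignment_id, reviewee_id]] ||= {
      reviewed_object_id: assignment_id,
      reviewer_id: reviewer_id,
      reviewee_id: reviewee_id,
      for_calibration: true
    }
  end

  def map_count
    @maps.size
  end
end

store = CalibrationStore.new
p1 = store.find_or_create_participant("student123")
p2 = store.find_or_create_participant("student123")  # same id both times
store.upsert_calibration_map(assignment_id: 1, reviewer_id: 99, reviewee_id: p1)
store.upsert_calibration_map(assignment_id: 1, reviewer_id: 99, reviewee_id: p2)
```

Running the flow twice with the same username leaves exactly one participant and one calibration map, which is the behavior the controller's find-or-create logic is meant to guarantee.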
begin
Determines if a calibration review is new or an edit of an existing response.
- Logic: Checks for an existing Response associated with the map and returns the appropriate redirect path (e.g., /response/new/:map_id).
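The branch described above reduces to a small path helper. Only /response/new/:map_id is documented in the source; the edit path below is an assumed placeholder:

```ruby
# Sketch of the "begin" redirect decision. Only /response/new/:map_id is
# documented; the edit route below is an assumption for illustration.
def begin_redirect_path(map_id, existing_response_id = nil)
  if existing_response_id
    "/response/edit/#{existing_response_id}"  # assumed edit route
  else
    "/response/new/#{map_id}"                 # documented new-review route
  end
end

begin_redirect_path(42)     # => "/response/new/42"
begin_redirect_path(42, 7)  # => "/response/edit/7"
```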
Part 2: Review Comparison & Reporting (Future Work)
Proposed Features
- Review Comparison View:
  - [PLACEHOLDER] Logic to fetch and compare a student's Response object with an instructor's Response object for the same submission.
  - [PLACEHOLDER] JSON endpoint to provide detailed score-by-score comparison data.
- Calibration Summary Report:
- [PLACEHOLDER] Logic to iterate over all student reviews for a calibration submission.
- [PLACEHOLDER] Aggregation logic to calculate alignment with the instructor's scores.
- [PLACEHOLDER] Support for graphical reporting of score distribution (agreements vs. deviations).
Future Reporting Endpoints
- GET /assignments/:assignment_id/calibration_comparison/:response_id [PROPOSED]:
  - To be implemented: Returns JSON comparing a specific student's review to the instructor's review.
- GET /assignments/:assignment_id/calibration_summary/:map_id [PROPOSED]:
  - To be implemented: Returns JSON summary of all student reviews for a specific calibration work.
Testing and Verification
Automated Testing (RSpec)
Extensive integration tests were added in spec/requests/api/v1/calibration_response_maps_spec.rb to verify the API endpoints.
- Success Paths: Verified 200 OK for listing maps and 201 Created for adding participants.
- Authorization: Ensured that students are blocked (403 Forbidden) from accessing calibration management actions.
- Error Handling: Tested 404 Not Found scenarios for invalid assignment IDs or usernames.
To run the automated tests:
rspec spec/requests/api/v1/calibration_response_maps_spec.rb
Manual UI Testing
The backend changes can be verified using the following steps:
- Login as an Instructor or TA.
- Navigate to an assignment's Calibration tab.
- Add Participant: Enter a username (e.g., "student123") in the text box.
- Verify Appearance: The user should appear in the list of calibration participants.
- Begin Review: Click the "Begin" link. It should redirect you to the review editor for that submission.