CSC/ECE 517 Spring 2026 - E2609. Review calibration



About Expertiza

Expertiza is an open-source project based on the Ruby on Rails framework. It allows instructors to create and customize assignments, manage student teams, and handle peer review workflows. Expertiza supports diverse submission types including URLs and wiki pages, and provides a robust framework for student assessment.

Problem Statement

A “calibration assignment” is an assignment in which students review work that has previously been reviewed by a member of the course staff. If the student’s review “resembles” the staff member’s review, the student is presumed to be a competent reviewer.

The goals for this part of the project were:

  • Implement the backend logic to handle calibration participants and expert reviews.
  • Add a mechanism to link a student (as a calibration participant) to an instructor's expert review.
  • Ensure the system supports the "for_calibration" distinction in ResponseMaps.
  • Provide JSON endpoints for listing participants and their submitted content to support the front-end Vue.js interface.

About Calibration Response Maps

Calibration Response Maps are a specialized type of feedback mapping. Unlike standard peer reviews where students review each other, a calibration map links an instructor (as the reviewer) to a specific piece of work (via a student participant) that serves as the "gold standard" for the assignment. Setting the for_calibration flag to true allows the system to distinguish these expert reviews for future comparison with student reviews.
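The distinction can be illustrated with a minimal plain-Ruby sketch. A Struct stands in for the actual ActiveRecord model here; the field names match the response_maps columns described in this document, but the records are illustrative:

```ruby
# Plain-Ruby stand-in for the ResponseMap model: the for_calibration flag
# separates an instructor's expert review map from ordinary peer-review maps.
ResponseMap = Struct.new(:reviewer_id, :reviewee_id, :reviewed_object_id, :for_calibration)

maps = [
  ResponseMap.new(1, 10, 100, true),   # instructor's expert ("gold standard") review
  ResponseMap.new(2, 10, 100, false),  # ordinary student peer review
  ResponseMap.new(3, 11, 100, false)
]

# Selecting on the flag yields only the expert maps used for later comparison.
calibration_maps = maps.select(&:for_calibration)
```

In the real system this filter is a database query, but the idea is the same: the flag is what lets student reviews be matched against the expert review of the same work.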

Current Implementation

Previously, the reimplemented back-end lacked a dedicated controller for creating and retrieving calibration-specific response maps. While the database supported response mapping, there was no streamlined way to:

  • Idempotently add a calibration participant.
  • Automatically generate the instructor-to-participant mapping with the correct calibration flags.
  • Retrieve submitted content (links/files) specifically for calibration work.

New Implementation

Database Changes

A migration was implemented to add the for_calibration boolean flag to the response_maps table:

class AddForCalibrationToResponseMaps < ActiveRecord::Migration[8.0]
  def change
    add_column :response_maps, :for_calibration, :boolean, default: false, null: false
  end
end

Controller: CalibrationResponseMapsController

This new controller handles the logic for calibration management. Key methods include:

index

Lists all calibration response maps for an assignment where the current instructor is the reviewer.

  • Logic: Filters ResponseMap where reviewed_object_id is the assignment ID, reviewer_id is the instructor's participant ID, and for_calibration is true.
  • Response: Returns JSON with nested reviewee and user data.
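The filter-and-nest logic behind index can be sketched in plain Ruby. The real controller runs an ActiveRecord query; the hash shapes and sample values below are illustrative assumptions, not the controller's exact serialization:

```ruby
require 'json'

# In-memory stand-ins for ResponseMap rows and their associated users.
maps = [
  { id: 5, reviewer_id: 1, reviewee_id: 10, reviewed_object_id: 100, for_calibration: true },
  { id: 6, reviewer_id: 2, reviewee_id: 11, reviewed_object_id: 100, for_calibration: false }
]
users = { 10 => { name: 'student123' } }

assignment_id = 100
instructor_participant_id = 1

# Filter on assignment, reviewer, and the calibration flag, then nest
# reviewee/user data the way the JSON response does.
payload = maps
  .select do |m|
    m[:reviewed_object_id] == assignment_id &&
      m[:reviewer_id] == instructor_participant_id &&
      m[:for_calibration]
  end
  .map { |m| m.merge(reviewee: { participant_id: m[:reviewee_id], user: users[m[:reviewee_id]] }) }

puts JSON.pretty_generate(payload)
```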

create

Handles adding a new calibration participant and setting up the expert review map.

  • Logic:
  1. Validates the provided username.
  2. Finds or creates an AssignmentParticipant for that user.
  3. Ensures the Instructor is also a participant in the assignment.
  4. Creates/updates a ResponseMap with for_calibration: true.
  • Response: Returns the participant, response map, and a team_payload containing submitted hyperlinks.
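The idempotent part of this flow (step 4) can be sketched with an in-memory "table". The helper name find_or_create_map is hypothetical, not the controller's actual method; in Rails this would typically be a find_or_create_by call:

```ruby
# In-memory stand-in for the response_maps table.
ResponseMaps = []

# Find an existing map for this (reviewer, reviewee, assignment) triple,
# or create one flagged for calibration. Calling it twice must not
# produce a duplicate row.
def find_or_create_map(reviewer_id, reviewee_id, assignment_id)
  existing = ResponseMaps.find do |m|
    m[:reviewer_id] == reviewer_id &&
      m[:reviewee_id] == reviewee_id &&
      m[:reviewed_object_id] == assignment_id
  end
  return existing if existing

  map = { reviewer_id: reviewer_id, reviewee_id: reviewee_id,
          reviewed_object_id: assignment_id, for_calibration: true }
  ResponseMaps << map
  map
end

first_call  = find_or_create_map(1, 10, 100)
second_call = find_or_create_map(1, 10, 100)  # returns the existing map
```

This is what makes "Add Participant" safe to submit repeatedly from the UI: resubmitting the same username does not create a second expert-review map.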

begin

Determines if a calibration review is new or an edit of an existing response.

  • Logic: Checks for an existing Response associated with the map and returns the appropriate redirect path (e.g., /response/new/:map_id).
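The decision can be sketched as a small path helper. The /response/new/:map_id path comes from the text above; the edit-path shape is an assumption for illustration:

```ruby
# Return the redirect path for a calibration review: a "new" path when no
# Response exists yet for the map, otherwise an "edit" path for the
# existing response. The edit path shape here is assumed.
def begin_path(map_id, existing_response_id)
  if existing_response_id.nil?
    "/response/new/#{map_id}"
  else
    "/response/edit/#{existing_response_id}"
  end
end
```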

Part 2: Review Comparison & Reporting (Future Work)

Proposed Features

  • Review Comparison View:
    • [PLACEHOLDER] Logic to fetch and compare a student's Response object with an instructor's Response object for the same submission.
    • [PLACEHOLDER] JSON endpoint to provide detailed score-by-score comparison data.
  • Calibration Summary Report:
    • [PLACEHOLDER] Logic to iterate over all student reviews for a calibration submission.
    • [PLACEHOLDER] Aggregation logic to calculate alignment with the instructor's scores.
    • [PLACEHOLDER] Support for graphical reporting of score distribution (agreements vs. deviations).

Future Reporting Endpoints

  • GET /assignments/:assignment_id/calibration_comparison/:response_id [PROPOSED]:
    • To be implemented: Returns JSON comparing a specific student's review to the instructor's review.
  • GET /assignments/:assignment_id/calibration_summary/:map_id [PROPOSED]:
    • To be implemented: Returns JSON summary of all student reviews for a specific calibration work.

Testing and Verification

Automated Testing (RSpec)

Extensive integration tests were added in spec/requests/api/v1/calibration_response_maps_spec.rb to verify the API endpoints.

  • Success Paths: Verified 200 OK for listing maps and 201 Created for adding participants.
  • Authorization: Ensured that students are blocked (403 Forbidden) from accessing calibration management actions.
  • Error Handling: Tested 404 Not Found scenarios for invalid assignment IDs or usernames.

To run the automated tests:

rspec spec/requests/api/v1/calibration_response_maps_spec.rb

Manual UI Testing

The backend changes can be verified using the following steps:

  1. Login as an Instructor or TA.
  2. Navigate to an assignment's Calibration tab.
  3. Add Participant: Enter a username (e.g., "student123") in the text box.
  4. Verify Appearance: The user should appear in the list of calibration participants.
  5. Begin Review: Click the "Begin" link. It should redirect you to the review editor for that submission.
