CSC/ECE 517 Fall 2021 - Refactor Evaluation of SQL Queries

From Expertiza_Wiki
Revision as of 23:41, 26 October 2021 by Ajshipma (talk | contribs)

About SQL File Evaluation

SQL File Evaluation is a program developed by Dr. Paul Wagner to help grade SQL exams given to students in a database systems class. The program parses the students' submitted exams and compares their answer queries against the correct answers for each question. This tool allows a professor to grade in seconds a set of exams that could traditionally take hours. When the software starts, it presents a GUI with options to connect to either an Oracle or MySQL database against which the answers are run and graded. Instructors must create their own assignment properties file, which lets them distribute points as they see fit. Once all of the inputs have been entered, the software runs every exam's answers, awards points according to the instructor's assignment properties file, and produces an output file for each exam as well as an overall summary file recording all of the students' grades.


Project Goal

The goal of this project was to refactor the back-end code of the software to improve clarity and readability. The main focus was on the evaluate method of the BackEnd class. The original version of the evaluate method was almost 200 lines of code and combined many different processes. Dr. Wagner said the method started out as a few lines but grew much larger as the code evolved and needed to be refactored. Our plan for the refactoring involved a few key tasks:

  • Break up sections of code that focus on the same task into separate methods
  • Create checking functions for conditionals that contain many ANDs and ORs
  • Rename variables that are only one or a few letters to more descriptive names
  • Add comments describing what each section of code does
  • Add comments to the methods we created stating their purpose, keeping the same styling
  • Ensure all code complies with the SOLID principles
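As a small illustration of the second task (the names here are hypothetical, not taken from the actual BackEnd source), a compound conditional full of ANDs and ORs can be extracted into a descriptively named checking method so the call site reads as plain English:

```java
public class ConditionExtraction {

	// Before: an inline conditional such as
	//   if (tok != null && tok.length() > 0 && (tok.equals("SELECT") || tok.equals("select")))
	// forces the reader to reconstruct its intent every time.
	//
	// After: the same logic lives behind a name that states its purpose.
	static boolean isSelectKeyword(String token) {
		return token != null
				&& token.length() > 0
				&& token.equalsIgnoreCase("SELECT");
	}

	public static void main(String[] args) {
		// The call site now documents itself.
		System.out.println(isSelectKeyword("select"));
		System.out.println(isSelectKeyword("FROM"));
		System.out.println(isSelectKeyword(null));
	}
}
```

The checking method also gives the condition a single place to be unit tested and changed.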

Following the refactoring, we also created JUnit tests to ensure the code still worked with the changes we had made. The tests verify that the proper files are created and that all of the necessary files are present. Additionally, the tests check the summary file against a summary file that is known to be correct, ensuring the summary (and all of the results within it) is produced properly.


Refactoring

Testing

Testing was a necessary step in working on this open-source software: the code worked when we received it, and we needed to ensure it still worked after our refactoring changes. Java has a testing framework called JUnit that allowed us to create unit tests and verify the code. We implemented these tests against the BackEnd class, which was the section of code we refactored.

The first step in creating the test was to set up both a front end and a back end object so the code was able to run. This allowed the tests to emulate the program as if it were actually running. We manually set up all of the information needed for the code to connect to the proper MySQL database so it could grade the exams. The code below shows how the unit test was set up to begin testing the BackEnd class. The JFXPanel is needed to make sure the test runs as if a user were using the software. At the end of the code, the evaluate method, which is what we refactored, is called.

class BackEndTests extends AbstractTest {

	@Test
	void test() {
		try {
			// initialize jfx for test to run
			JFXPanel fxPanel = new JFXPanel();
			
			// set default values for testing an input
			// values will need to be changed for information used in personal testing database
			String dbmsChoice = "MySQL 8.0";
			String dbmsHost = "localhost";
			String dbmsPort = "3306";
			String dbmsSystemID = "evaltest";
			String dbmsUsername = "root";
			String dbmsPassword = "****";
			String evaluationFolder = "C:\\Users\\andrewshipman\\eclipse-workspace\\SQLFE_project";
			String assignPropFile = "assignmentProperties-MySQL";
			FrontEnd f = new FrontEnd();
			
			// set values for the front end used for testing
			f.setDbmsChoice(dbmsChoice);
			f.setDbmsHost(dbmsHost);
			f.setDbmsPort(dbmsPort);
			f.setDbmsSystemID(dbmsSystemID);
			f.setDbmsUsername(dbmsUsername);
			f.setDbmsPassword(dbmsPassword);
			f.setEvaluationFolder(evaluationFolder);
			f.setAssignPropFile(assignPropFile);
			
			// setup back end and run evaluation
			BackEnd b = new BackEnd(f);
			f.setABackEnd(b);
			b.transferData(f);
			b.evaluate();

After the evaluate method completes, the test validates that evaluation was done correctly. This is done in a few ways. First, the test checks that the summary file was created, which proves that the evaluate method at least ran and attempted to produce output. The code for this check is fairly simple:

// test if output file was created
File testFile = new File(f.getEvaluationFolder() + "\\evaluations\\AAA_grade_summary.out");
assertTrue(testFile.exists());

The next test checks that the program created the proper number of output files. This should be one for each exam submitted to the files folder for evaluation, plus an additional three files: one for the summary, one for any problems found by the parser, and one containing the student comments. The code for this test is:

// test if there is the proper number of files
File inputFiles = new File(f.getEvaluationFolder() + "\\files");
File outputFiles = new File(f.getEvaluationFolder() + "\\evaluations");
assertEquals(inputFiles.list().length, (outputFiles.list().length - 3));

The final test checks the output summary of the program against a file known to be correct for the first 5 sample files that came with the Java code. This test uses a file-comparison API (FileUtils.contentEquals, from the Apache Commons IO library) to check whether the contents of two files are the same. The correct file is located in another folder in the project, called TestingFiles. The code for this test is below:

// test if grade summary output is correct for 5 files
// uses file in testing file to assert output is correct for 5 evaluations
File compareFile = new File(f.getEvaluationFolder() + "\\TestingFiles\\Correct_5_evals_summary.out");
assertTrue(FileUtils.contentEquals(testFile, compareFile));

All of the tests passed, along with manual testing and checking of all 66 sample files included with the project. This confirmed that our refactoring preserved the behavior of the original program.