The University of Victoria (UVic) is one of Canada’s leading research-intensive universities, located in Victoria on Vancouver Island, British Columbia. Because it serves both large and small courses, UVic uses Crowdmark in multiple departments to grade faster while giving richer feedback. One instructor in UVic’s Computer Science department, Ulrike Stege, uses Crowdmark to streamline the assessment process so that her team has more time to provide consistent, high-quality feedback.
The Challenge
Before Crowdmark, UVic downloaded student work from Brightspace and provided feedback by hand. Although marking by hand was necessary to give feedback directly on theoretical proofs, Ulrike describes this workflow as “painfully slow” and “cumbersome.” “I wanted TAs to do creative things rather than wasting time turning pages and downloading files.”
In addition to being cumbersome, this workflow introduced inconsistencies in grading because graders could not see which comments and grades were being given for similar student responses.
The Solution
In June 2021, Ulrike started looking for a way to grade efficiently while giving more detailed feedback. She decided to try Crowdmark after hearing about its benefits from a colleague at the University of British Columbia. Ulrike uses the assigned assessment workflow to grade written answers, quizzes, multiple-choice questions, midterms, and exams in Computer Science. Not only was she able to grade all of these assessment types with Crowdmark, she also found that the user-friendly interface made set-up straightforward.
Ulrike also finds that Crowdmark improves the consistency of grading. There are times when a question has been marked too strictly and the marking scheme needs to be adjusted. Using the comment library, Ulrike changes the points associated with a comment, and the change is applied to every instance of that comment. Similarly, when there is an error in a comment’s text, or when more detailed feedback would be helpful, the team updates the comment’s content. The team finds that being able to retroactively apply changes to all instances of a comment “is just amazing.” Errors are quickly and easily rectified, and no one has to waste time looking through booklets to ensure grades are consistent. These consistency gains come on top of the workflow efficiencies.
“The consistency of changing things is great. If you capture a mistake you can fix it everywhere. That alone is so much better than if you have to find those problems.”
In addition to easier and more consistent feedback, mechanical tasks such as downloading booklets and adding up points disappear with Crowdmark. Annotations are made directly on the student’s work, and point deductions attached to comments are totalled automatically. “Setting up assessments and marking was much more efficient,” and the grading team has more time to focus on providing feedback that improves student learning.
With Crowdmark’s grading tools, Ulrike and her team better support student learning by providing smarter, more informative comments. These comments are reused across assessments with similar questions: after building a marking scheme in the comment library, Ulrike exports the comments and imports them into new assessments, so she does not need to spend time recreating them.
“The feedback is better, and we’ve figured out how to give better feedback. When the students receive quality comments, they understand where they went wrong.”
The Takeaway
The Computer Science department now provides better, more consistent feedback for students. It is a better use of TAs’ time and skill set, as the mechanical, brainless tasks are delegated to Crowdmark.