Computerized exam reviews: In-person and individualized feedback to students after a computerized exam

Wayne L. Chang, Matthew West, Craig Zilles, David Mussulman, Carleen Sacris

Research output: Contribution to journal › Conference article › peer-review

Abstract

Computerized testing centers are a promising new technology for running exams in large (200+ students) courses. They eliminate many of the logistical problems of pencil-and-paper exams: no conflict exams need to be scheduled, exams are graded efficiently and consistently, and timely feedback is provided to students. Computerized testing dramatically shortens the feedback cycle between student learning and assessment, and it enables frequent testing and second-chance testing in large courses, practices that have been shown to lead to significant improvements in learning outcomes. However, in some courses involving mathematical problem solving, a major source of student dissatisfaction with computerized testing is that numerical-answer questions are typically graded solely on the correctness of the final answer. The two major concerns reported by students are: (1) limited access to the assessment, and corresponding learning opportunities, after the exam, and (2) the lack of partial credit for correct solution procedures with incorrect final answers. To address these concerns, a large public Midwestern university has developed a new exam-review service that provides in-person feedback to students after the completion of computerized exams, with the option of human-assigned partial credit for a correct solution procedure. These review sessions are hosted in the computerized testing facility to preserve the integrity of exam problems for future use. In the sessions, students go through the scratch work collected at the end of the exam, and any program code they wrote to solve problems, under the guidance of a course staff member. The format is student guided, with course staff present to help identify conceptual errors.
In this paper, we present the design of the review system in a large-scale computerized testing facility, including the scheduling logistics, software support, course staff training, and guidance to students. We report detailed data on student usage, including survey data on students' desire for exam review and the degree to which our system addresses this desire. We find that usage of exam review sessions depends on three factors: the difficulty of the exam, whether exam regrading was offered during the session, and whether a retry exam was available the following week.

Original language: English (US)
Article number: 365
Journal: ASEE Annual Conference and Exposition, Conference Proceedings
Volume: 2020-June
State: Published - Jun 22, 2020
Event: 2020 ASEE Virtual Annual Conference, ASEE 2020 - Virtual, Online
Duration: Jun 22, 2020 - Jun 26, 2020

ASJC Scopus subject areas

  • General Engineering
