TY - JOUR
T1 - Computerized exam reviews
T2 - 2020 ASEE Virtual Annual Conference, ASEE 2020
AU - Chang, Wayne L.
AU - West, Matthew
AU - Zilles, Craig
AU - Mussulman, David
AU - Sacris, Carleen
N1 - Publisher Copyright:
© American Society for Engineering Education 2020.
PY - 2020/6/22
Y1 - 2020/6/22
N2 - Computerized testing centers are a promising new technology for running exams in large (200+ students) courses. They eliminate many of the logistical problems of pencil-and-paper exams: no conflict exams need to be scheduled, exams are graded efficiently and consistently, and timely feedback is provided to students. Computerized testing can dramatically shorten the cycle between student learning and assessment feedback, and it enables frequent testing and second-chance testing in large courses, which has been shown to lead to significant improvements in learning outcomes. However, in some courses involving mathematical problem solving, a significant source of student dissatisfaction with computerized testing is that numerical-answer questions are typically graded solely on the correctness of the final answer. The two major concerns reported by students are: (1) limited access to the assessment and the corresponding learning opportunities post-assessment, and (2) the lack of partial credit for correct solution procedures with incorrect final answers. To address these concerns, a large public Midwestern university has developed a new exam-review service that provides in-person feedback to students after the completion of computerized exams, with the option of human-assigned partial credit for a correct solution procedure. These review sessions are hosted in the computerized testing facility to preserve the integrity of exam problems for future use. In the sessions, students go through the scratch work collected at the end of the exam, and any program code they wrote to solve problems, under the guidance of a course staff member. The sessions are student-guided, with course staff present to assist in identifying conceptual errors. In this paper, we present the design of the review system in a large-scale computerized testing facility, including the scheduling logistics, software support, course staff training, and guidance to students. Detailed student usage data are reported, including survey data on students' desire for exam review and the degree to which our system addresses this desire. We find that usage of the exam review sessions depends on three factors: the difficulty of the exam, whether exam regrading was offered during the session, and whether a retry exam was available the following week.
UR - http://www.scopus.com/inward/record.url?scp=85095765416&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85095765416&partnerID=8YFLogxK
U2 - 10.18260/1-2--34321
DO - 10.18260/1-2--34321
M3 - Conference article
AN - SCOPUS:85095765416
SN - 2153-5965
VL - 2020-June
JO - ASEE Annual Conference and Exposition, Conference Proceedings
JF - ASEE Annual Conference and Exposition, Conference Proceedings
M1 - 365
Y2 - 22 June 2020 through 26 June 2020
ER -