TY - GEN
T1 - Caches as an Example of Machine-gradable Exam Questions for Complex Engineering Systems
AU - Mahmood, Suleman
AU - Zhao, Mingjie
AU - Khan, Omar
AU - Herman, Geoffrey L.
N1 - Publisher Copyright:
© 2020 IEEE.
PY - 2020/10/21
Y1 - 2020/10/21
AB - This Innovative Practice Full Paper presents a framework for generating computer-based exams for complex engineering systems (such as cache memories) that can be machine graded while still offering partial credit to students. Complex, multi-faceted engineering systems often require long, multi-part problems to fully assess students' understanding of those systems; cache memories represent one such system in computer architecture courses. Traditionally, we assessed students' understanding of caches using comprehensive, multi-part questions on a paper-based exam. Grading these exams was time-consuming and relied on subjective judgment. To cope with rising enrollment, we sought to address these issues by developing machine-administered, machine-gradable exams that did not rely heavily on multiple-choice questions or exact numerical responses. Additionally, this system needed to provide partial credit, a common expectation of our students. We developed a cache simulator to serve as a back-end for our questions and used it to develop exam questions and new homework assignments that help students practice cache memory concepts. To give students access to fair partial credit, we allowed multiple submissions for the exam questions with limited feedback. We also awarded partial credit for answers within a certain tolerance of the correct answer, with the credit awarded decreasing as the deviation from the correct answer increased. Consequently, students could correct minor mistakes or propagated errors, which are common reasons for awarding partial credit. To evaluate the effect of the switch from paper-based to computerized exams, we ported questions from one of our paper-based exams to a computerized exam. We compared student performance on the paper-based and computerized versions of the questions and found mixed results, with students performing comparably to or better on the computer-based exam than on the paper-based exam. We also surveyed students about their experience with the computer-based exam; students overwhelmingly indicated a preference for it. We believe that ideas from our work can be used to automate the generation, administration, and grading of complex, multi-part questions in engineering disciplines beyond computer architecture.
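N1 - Illustrative note: the abstract describes partial credit that decreases as an answer deviates from the correct value, but does not give the authors' formula. A minimal sketch of one such scheme, assuming a linear falloff to zero at a relative tolerance bound (the function name, the linear shape, the default tolerance, and the use of Python are illustrative assumptions, not the paper's implementation):

def partial_credit(answer, correct, tol=0.05):
    # Full credit for an exact match.
    if answer == correct:
        return 1.0
    # Relative deviation from the correct value (assumed metric);
    # the max() guards against division by zero when correct == 0.
    deviation = abs(answer - correct) / max(abs(correct), 1e-12)
    # Credit falls linearly from 1 at zero deviation to 0 at the
    # tolerance bound; anything beyond the bound earns no credit.
    return max(0.0, 1.0 - deviation / tol)

Under this sketch, an answer within 5% of the correct value earns proportionally reduced credit (e.g., a 2% deviation earns 0.6), which matches the abstract's description of credit diminishing with deviation.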
KW - Cache Simulator
KW - Computerized Testing
UR - http://www.scopus.com/inward/record.url?scp=85098544768&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85098544768&partnerID=8YFLogxK
U2 - 10.1109/FIE44824.2020.9273822
DO - 10.1109/FIE44824.2020.9273822
M3 - Conference contribution
AN - SCOPUS:85098544768
T3 - Proceedings - Frontiers in Education Conference, FIE
BT - 2020 IEEE Frontiers in Education Conference, FIE 2020 - Proceedings
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2020 IEEE Frontiers in Education Conference, FIE 2020
Y2 - 21 October 2020 through 24 October 2020
ER -