Caches as an Example of Machine-gradable Exam Questions for Complex Engineering Systems

Suleman Mahmood, Mingjie Zhao, Omar Khan, Geoffrey L. Herman

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

This Innovative Practice Full Paper presents a framework for generating computer-based exams for complex engineering systems (such as cache memories) that can be machine graded while still offering partial credit to students. Complex, multi-faceted engineering systems often require long, multi-part problems to fully assess students' understanding of those systems. Cache memories represent one such system in computer architecture courses. Traditionally, we assessed students' understanding of caches using comprehensive, multi-part questions on a paper-based exam. Grading these exams was time-consuming and relied on subjective judgment. To cope with rising enrollment, we sought to address these issues by developing machine-administered, machine-gradable exams that did not rely heavily on multiple-choice questions or exact numerical responses. Additionally, this system needed to provide partial credit, a common expectation of our students. We developed a cache simulator to use as a back-end for our questions. We used the simulator to develop exam questions and new homework assignments to help students practice cache memory concepts. To give students access to fair partial credit, we allowed multiple submissions for the exam questions with limited feedback. We also awarded partial credit for answers within a certain tolerance of the correct answer, with the credit awarded decreasing as deviation from the correct answer increased. Consequently, students could recover from minor mistakes or propagating errors, which are common reasons for awarding partial credit. To evaluate the effect of the switch from paper-based to computerized exams, we ported questions from one of our paper-based exams to a computerized exam. We evaluated the differences in student performance between the paper-based and computerized versions of the questions and found mixed results, with students performing comparably to or better on the computer-based exam than on the paper-based exam.
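The tolerance-based scoring described above could be sketched as follows. This is an illustrative model only; the paper does not specify its exact scoring function, so the linear falloff, the `tolerance` parameter, and the function name are assumptions:

```python
def partial_credit(submitted: float, correct: float, tolerance: float = 0.10) -> float:
    """Return a credit fraction in [0, 1]: full credit for an exact answer,
    linearly decreasing credit as relative deviation grows, and zero credit
    once the deviation reaches the tolerance bound.

    Illustrative sketch only -- the published system's actual scoring
    function and tolerance are not specified in the abstract.
    """
    if correct == 0:
        return 1.0 if submitted == 0 else 0.0
    deviation = abs(submitted - correct) / abs(correct)
    if deviation >= tolerance:
        return 0.0
    return 1.0 - deviation / tolerance
```

For example, under a 10% tolerance, an answer 5% off would earn half credit, while an answer 11% off would earn none; this lets a student who made a small arithmetic slip retain most of the credit.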
We also surveyed students about their experience with the computer-based exam. Students overwhelmingly indicated a preference for the computer-based exam. We believe that ideas from our work can be used to automate generation, administration, and grading of complex multi-part questions in engineering disciplines beyond computer architecture.
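A simulator back-end like the one the abstract describes can be approximated, at its simplest, by a model that classifies each memory access as a hit or a miss. The sketch below models a direct-mapped cache; the class name, interface, and fill-on-miss policy are assumptions for illustration, not the authors' actual simulator:

```python
class DirectMappedCache:
    """Minimal direct-mapped cache model that classifies byte addresses
    as hits or misses. Illustrative sketch only; the authors' simulator
    and its interface are not described in detail in the abstract.
    """

    def __init__(self, num_sets: int, block_size: int) -> None:
        self.num_sets = num_sets
        self.block_size = block_size
        self.tags = [None] * num_sets  # one tag slot per set

    def access(self, address: int) -> str:
        block = address // self.block_size   # which memory block
        index = block % self.num_sets        # which cache set
        tag = block // self.num_sets         # identifies the block within the set
        if self.tags[index] == tag:
            return "hit"
        self.tags[index] = tag               # fill (or evict) on miss
        return "miss"
```

Driving such a model with a randomized address trace is one way a question generator could produce many structurally identical but numerically distinct exam instances, each with a machine-checkable answer key.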

Original language: English (US)
Title of host publication: 2020 IEEE Frontiers in Education Conference, FIE 2020 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781728189611
DOIs
State: Published - Oct 21 2020
Event: 2020 IEEE Frontiers in Education Conference, FIE 2020 - Uppsala, Sweden
Duration: Oct 21 2020 – Oct 24 2020

Publication series

Name: Proceedings - Frontiers in Education Conference, FIE
Volume: 2020-October
ISSN (Print): 1539-4565

Conference

Conference: 2020 IEEE Frontiers in Education Conference, FIE 2020
Country: Sweden
City: Uppsala
Period: 10/21/20 – 10/24/20

Keywords

  • Cache Simulator
  • Computerized Testing

ASJC Scopus subject areas

  • Software
  • Education
  • Computer Science Applications

