Using computer adaptive testing to assess physics proficiency and improve exam performance in an introductory physics course

Jason W. Morphew, Jose P. Mestre, Hyeon Ah Kang, Hua Hua Chang, Gregory Fabry

Research output: Contribution to journal › Article › peer-review

Abstract

Prior research has established that students often underprepare for midterm examinations yet remain overconfident in their proficiency. Research concerning the testing effect has demonstrated that utilizing testing as a study strategy leads to higher performance and more accurate confidence compared to more common study strategies such as rereading or reviewing homework problems. We report on three experiments that explore the viability of using computer adaptive testing (CAT) for assessing students' physics proficiency, for preparing students for midterm exams by diagnosing their weaknesses, and for predicting scores on midterm exams in an introductory calculus-based mechanics course for science and engineering majors. The first two experiments evaluated the reliability and validity of the CAT algorithm. In addition, we investigated the ability of the CAT test to predict performance on the midterm exam. The third experiment explored whether completing two CAT tests in the days before a midterm exam would facilitate performance on the midterm exam. Scores on the CAT tests and the midterm exams were significantly correlated and, on average, were not statistically different from each other. This provides evidence for moderate parallel-forms reliability and criterion-related validity of the CAT algorithm. In addition, when used as a diagnostic tool, CAT showed promise in helping students perform better on midterm exams. Finally, we found that the CAT tests predicted the average performance on the midterm exams reasonably well; however, the CAT tests were not as accurate as desired at predicting the performance of individual students. While CAT shows promise for practice testing, more research is needed to refine testing algorithms to increase reliability before implementing CAT for summative evaluations.
In light of these findings, we believe that more research is needed comparing CAT to traditional paper-and-pencil practice tests in order to determine whether the effort needed to create a CAT system is worthwhile.
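To make the mechanics of adaptive testing concrete: a CAT typically alternates between estimating the examinee's ability from responses so far and administering the unused item that is most informative at that estimate. The sketch below is illustrative only and is not the algorithm used in the study; it assumes a one-parameter (Rasch) item response model, maximum-information item selection, and a simple gradient-ascent maximum-likelihood ability update, all of which are standard CAT components rather than details taken from this article.

```python
import math

def rasch_prob(theta, b):
    """Probability of a correct response under the Rasch (1PL) model."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def item_information(theta, b):
    """Fisher information of a Rasch item at ability theta: p * (1 - p),
    which is largest when item difficulty b is close to theta."""
    p = rasch_prob(theta, b)
    return p * (1.0 - p)

def update_theta(theta, responses, steps=20, lr=0.5):
    """Crude maximum-likelihood ability estimate: gradient ascent on the
    Rasch log-likelihood over (difficulty, score) pairs, score in {0, 1}."""
    for _ in range(steps):
        grad = sum(x - rasch_prob(theta, b) for b, x in responses)
        theta += lr * grad
    return theta

def run_cat(bank, answer_fn, test_length=10, theta0=0.0):
    """Administer a fixed-length CAT: at each step, give the unused item
    with maximum information at the current ability estimate, record the
    response, and re-estimate ability."""
    theta = theta0
    unused = list(bank)       # bank is a list of item difficulties
    responses = []
    for _ in range(test_length):
        b = max(unused, key=lambda d: item_information(theta, d))
        unused.remove(b)
        x = answer_fn(b)      # 1 if answered correctly, 0 otherwise
        responses.append((b, x))
        theta = update_theta(theta, responses)
    return theta
```

For example, simulating a student who answers every item easier than difficulty 1.0 correctly, `run_cat([i / 2 - 3 for i in range(13)], lambda b: 1 if b < 1.0 else 0, test_length=8)` homes in on an ability estimate near that threshold. Real CAT systems add refinements this sketch omits, such as exposure control, content balancing, and stopping rules based on the standard error of the ability estimate.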

Original language: English (US)
Article number: 020110
Journal: Physical Review Physics Education Research
Volume: 14
Issue number: 2
DOIs
State: Published - Sep 20 2018

ASJC Scopus subject areas

  • Education
  • Physics and Astronomy (all)
