Abstract
Item omissions in large-scale assessments may occur for various reasons, ranging from disengagement to being unable to solve an item and giving up. Current response-time-based classification approaches allow researchers to implement different treatments of item omissions presumed to stem from different mechanisms. These approaches, however, are limited in that they require a clear-cut decision on the underlying missingness mechanism and do not account for uncertainty in the classification. We present a response-time-based mixture modeling approach that overcomes this limitation. The approach (a) facilitates disentangling item omissions stemming from disengagement from those arising during solution behavior, (b) takes the uncertainty in omission classification into account, (c) allows omission mechanisms to vary at the item-by-examinee level, (d) supports investigating person and item characteristics associated with different types of omission behavior, and (e) gives researchers flexibility in deciding how to handle different types of omissions. The approach exhibits good parameter recovery under realistic research conditions. We illustrate it on data from the Programme for the International Assessment of Adult Competencies (PIAAC) 2012 and compare it against previous classification approaches for item omissions.
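The abstract gives no implementation details, but the core idea of classifying omissions by response time can be illustrated with a minimal sketch. The snippet below is a hypothetical two-component mixture fitted by EM to simulated log response times of omitted items, where the faster component stands in for disengaged skipping and the slower one for omissions after solution behavior; the data, component means, and EM routine are all illustrative assumptions, not the authors' model.

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulated (hypothetical) log response times for omitted items:
# quick "disengaged" skips vs. slower omissions after solution behavior.
log_rt = np.concatenate([
    rng.normal(0.5, 0.3, 300),   # disengaged omissions: fast
    rng.normal(2.5, 0.4, 200),   # solution-behavior omissions: slow
])

def fit_two_class_mixture(x, n_iter=200):
    """Fit a two-component normal mixture to log response times via EM."""
    mu = np.quantile(x, [0.25, 0.75])        # initialize from data quantiles
    sigma = np.array([x.std(), x.std()])
    weight = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: posterior class probabilities for each omission.
        dens = np.stack([
            weight[k] / (sigma[k] * np.sqrt(2 * np.pi))
            * np.exp(-0.5 * ((x - mu[k]) / sigma[k]) ** 2)
            for k in range(2)
        ])
        resp = dens / dens.sum(axis=0)
        # M-step: update weights, means, and standard deviations.
        nk = resp.sum(axis=1)
        weight = nk / len(x)
        mu = (resp * x).sum(axis=1) / nk
        sigma = np.sqrt((resp * (x - mu[:, None]) ** 2).sum(axis=1) / nk)
    return weight, mu, sigma, resp

weight, mu, sigma, resp = fit_two_class_mixture(log_rt)

# Posterior probability that each omission belongs to the faster
# ("disengaged") component -- the classification uncertainty the
# abstract refers to is retained rather than dichotomized.
p_disengaged = resp[np.argmin(mu)]
```

Note how the posterior probabilities `p_disengaged` preserve classification uncertainty at the item-by-examinee level instead of forcing a single clear-cut decision, which is the limitation of earlier approaches the abstract highlights.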
Original language | English (US)
---|---
Pages (from-to) | 599-619
Number of pages | 21
Journal | Multivariate Behavioral Research
Volume | 59
Issue number | 3
Early online date | Apr 9 2024
DOIs |
State | Published - 2024
Keywords
- item omissions
- item response theory
- mixture modeling
- response times
- test-taking engagement
ASJC Scopus subject areas
- Statistics and Probability
- Experimental and Cognitive Psychology
- Arts and Humanities (miscellaneous)