TY - GEN
T1 - Exploring joint maximum likelihood estimation for cognitive diagnosis models
AU - Chiu, Chia Yi
AU - Koehn, Hans Friedrich
AU - Zheng, Yi
AU - Henson, Robert
N1 - Publisher Copyright:
© Springer International Publishing Switzerland 2015.
PY - 2015/8/8
Y1 - 2015/8/8
N2 - Current methods for fitting cognitive diagnosis models (CDMs) to educational data typically rely on expectation maximization (EM) or Markov chain Monte Carlo (MCMC) for estimating the item parameters and examinees’ proficiency class memberships. However, for more complex CDMs such as the reduced reparameterized unified model (Reduced RUM) and the (saturated) loglinear cognitive diagnosis model (LCDM), EM and MCMC have the reputation of often consuming excessive CPU time. Joint maximum likelihood estimation (JMLE) is proposed as an alternative to EM and MCMC. The maximization of the joint likelihood is typically accomplished in a few iterations, thereby drastically reducing the CPU time usually needed for fitting advanced CDMs like the Reduced RUM or the (saturated) LCDM. As another attractive feature, the JMLE algorithm presented here resolves the traditional issue with JMLE estimators (their lack of statistical consistency) by using an external, statistically consistent estimator to obtain initial estimates of examinees’ class memberships as starting values. It can be proven that under this condition the JMLE item parameter estimators are also statistically consistent. The computational performance of the proposed JMLE algorithm is evaluated in two comprehensive simulation studies.
KW - Cognitive diagnosis
KW - Consistency
KW - Joint maximum likelihood estimation
KW - Nonparametric classification
UR - http://www.scopus.com/inward/record.url?scp=84950254585&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84950254585&partnerID=8YFLogxK
U2 - 10.1007/978-3-319-19977-1_19
DO - 10.1007/978-3-319-19977-1_19
M3 - Conference contribution
AN - SCOPUS:84950254585
SN - 9783319199764
VL - 140
T3 - Springer Proceedings in Mathematics and Statistics
SP - 263
EP - 277
BT - Quantitative Psychology Research
A2 - van der Ark, L. Andries
A2 - Wang, Wen-Chung
A2 - Douglas, Jeffrey A.
A2 - Bolt, Daniel M.
A2 - Chow, Sy-Miin
PB - Springer
T2 - 79th Annual International Meeting of the Psychometric Society, IMPS 2014
Y2 - 21 July 2014 through 25 July 2014
ER -