TY - JOUR
T1 - MCMC stopping rules in latent variable modelling
AU - Kwon, Sunbeom
AU - Zhang, Susu
AU - Köhn, Hans Friedrich
AU - Zhang, Bo
N1 - Publisher Copyright:
© 2024 The Author(s). British Journal of Mathematical and Statistical Psychology published by John Wiley & Sons Ltd on behalf of the British Psychological Society.
PY - 2024/10/10
Y1 - 2024/10/10
AB - Bayesian analysis relies heavily on the Markov chain Monte Carlo (MCMC) algorithm to obtain random samples from posterior distributions. In this study, we compare the performance of MCMC stopping rules and provide a guideline for determining the termination point of the MCMC algorithm in latent variable models. In simulation studies, we examine the performance of four different MCMC stopping rules: potential scale reduction factor (PSRF), fixed-width stopping rule, Geweke's diagnostic, and effective sample size. Specifically, we evaluate these stopping rules in the context of the DINA model and the bifactor item response theory model, two commonly used latent variable models in educational and psychological measurement. Our simulation study findings suggest that single-chain approaches outperform multiple-chain approaches in terms of item parameter accuracy. However, when it comes to person parameter estimates, the effect of stopping rules diminishes. We caution against relying solely on the univariate PSRF, which is the most popular method, as it may terminate the algorithm prematurely and produce biased item parameter estimates if the cut-off value is not chosen carefully. Our research offers guidance to practitioners on choosing suitable stopping rules to improve the precision of the MCMC algorithm in models involving latent variables.
KW - bifactor IRT model
KW - DINA model
KW - effective sample size
KW - Gelman–Rubin diagnostic
KW - Geweke's diagnostic
KW - MCMC algorithm
KW - Monte Carlo standard error
UR - http://www.scopus.com/inward/record.url?scp=85205840507&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85205840507&partnerID=8YFLogxK
U2 - 10.1111/bmsp.12357
DO - 10.1111/bmsp.12357
M3 - Article
C2 - 39387464
AN - SCOPUS:85205840507
SN - 0007-1102
JO - British Journal of Mathematical and Statistical Psychology
JF - British Journal of Mathematical and Statistical Psychology
ER -