Exploratory Restricted Latent Class Models with Monotonicity Requirements under Pólya–Gamma Data Augmentation

James Joseph Balamuta, Steven Andrew Culpepper

Research output: Contribution to journal › Article › peer-review


Restricted latent class models (RLCMs) provide an important framework for supporting diagnostic research in education and psychology. Recent research proposed fully exploratory methods for inferring the latent structure. However, prior research is limited by the use of a restrictive monotonicity condition, or by prior formulations that cannot incorporate prior information about the latent structure to validate expert knowledge. We develop new methods that relax existing monotonicity restrictions and provide greater insight into the latent structure. Furthermore, existing Bayesian methods use only a probit link function, and we provide a new formulation for the exploratory RLCM with a logit link function, which has the additional advantage of being computationally more efficient for larger sample sizes. We present four new Bayesian formulations that employ different link functions (i.e., the logit using Pólya–gamma data augmentation versus the probit) and priors for inducing sparsity in the latent structure. We report Monte Carlo simulation studies to demonstrate accurate parameter recovery. Furthermore, we report results from an application to the Last Series of the Standard Progressive Matrices to illustrate our new methods.
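The logit formulation in the abstract rests on Pólya–gamma data augmentation, which turns a logistic likelihood into a conditionally Gaussian one so that Gibbs sampling applies. As a minimal sketch of that core idea (not the authors' RLCM sampler), the following illustrates a Pólya–gamma augmented Gibbs sampler for plain Bayesian logistic regression; the `rpg` helper, the truncation level `K`, and the simulated data are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

def rpg(b, c, rng, K=200):
    # Draw from PG(b, c) via the (truncated) infinite sum-of-gammas
    # representation: PG(b, c) = (1 / (2 pi^2)) * sum_k g_k / ((k - 1/2)^2 + c^2 / (4 pi^2)),
    # g_k ~ Gamma(b, 1). K = 200 terms is an illustrative truncation.
    k = np.arange(1, K + 1)
    g = rng.gamma(b, 1.0, size=(c.shape[0], K))
    denom = (k - 0.5) ** 2 + (c[:, None] / (2.0 * np.pi)) ** 2
    return np.sum(g / denom, axis=1) / (2.0 * np.pi ** 2)

# Simulated logistic-regression data (hypothetical example, not from the paper)
n, p = 1000, 2
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true = np.array([-0.5, 1.0])
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-X @ beta_true)))

# Gibbs sampler: alternate omega | beta (Polya-gamma draws) with
# beta | omega, y (a Gaussian full conditional), following Polson,
# Scott & Windle's augmentation scheme.
B_inv = np.eye(p) / 10.0            # prior precision: beta ~ N(0, 10 I)
kappa = y - 0.5                     # the "centered" response in the PG scheme
beta = np.zeros(p)
draws = []
for it in range(600):
    omega = rpg(1.0, X @ beta, rng)             # one PG(1, x_i' beta) per observation
    V = np.linalg.inv((X.T * omega) @ X + B_inv)  # posterior covariance of beta
    m = V @ (X.T @ kappa)                         # posterior mean of beta
    beta = rng.multivariate_normal(m, V)
    if it >= 100:                                 # discard burn-in
        draws.append(beta)

post_mean = np.mean(draws, axis=0)
```

With enough data the posterior mean should land near `beta_true`; the conditionally Gaussian update for `beta` is what makes this augmentation attractive computationally, as the abstract notes for larger sample sizes.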

Original language: English (US)
Pages (from-to): 903-945
Number of pages: 43
Issue number: 3
State: Published - Sep 2022


Keywords

  • Bayesian
  • Pólya–gamma data augmentation
  • restricted latent class models

ASJC Scopus subject areas

  • Applied Mathematics
  • General Psychology


