Consistency Theory for the General Nonparametric Classification Method

Chia Yi Chiu, Hans Friedrich Köhn

Research output: Contribution to journal › Article › peer-review

Abstract

Parametric likelihood estimation is the prevailing method for fitting cognitive diagnosis models—also called diagnostic classification models (DCMs). Nonparametric concepts and methods that do not rely on a parametric statistical model have been proposed for cognitive diagnosis. These methods are particularly useful when sample sizes are small. The general nonparametric classification (GNPC) method for assigning examinees to proficiency classes can accommodate assessment data conforming to any diagnostic classification model that describes the probability of a correct item response as an increasing function of the number of required attributes mastered by an examinee (known as the “monotonicity assumption”). Hence, the GNPC method can be used with any model that can be represented as a general DCM. However, the statistical properties of the estimator of examinees’ proficiency class are currently unknown. In this article, the consistency theory of the GNPC proficiency-class estimator is developed and its statistical consistency is proven.
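To make the classification idea concrete, the following is a minimal illustrative sketch (not the article's actual GNPC estimator) of nonparametric classification for cognitive diagnosis: an examinee is assigned to the proficiency class whose ideal item-response pattern lies closest, in Hamming distance, to the observed responses. This corresponds to the simpler conjunctive (DINA-type) NPC variant; the GNPC method discussed in the article generalizes the ideal responses via weighted averages. The Q-matrix and response vector below are hypothetical.

```python
# Sketch of nonparametric classification for cognitive diagnosis:
# pick the attribute profile whose ideal responses best match the data.
from itertools import product

import numpy as np

def ideal_response(alpha, q):
    """Conjunctive (DINA-type) ideal response: 1 iff the examinee
    masters every attribute the item requires."""
    return int(all(a >= qk for a, qk in zip(alpha, q)))

def classify(y, Q):
    """Return the attribute profile minimizing the Hamming distance
    between the observed responses y and the profile's ideal responses."""
    K = Q.shape[1]
    best, best_dist = None, None
    for alpha in product([0, 1], repeat=K):
        eta = np.array([ideal_response(alpha, q) for q in Q])
        d = int(np.sum(np.abs(np.asarray(y) - eta)))
        if best_dist is None or d < best_dist:
            best, best_dist = alpha, d
    return best

# Hypothetical 4-item, 2-attribute Q-matrix and one response vector.
Q = np.array([[1, 0], [0, 1], [1, 1], [1, 0]])
y = [1, 0, 0, 1]
print(classify(y, Q))  # -> (1, 0): the profile mastering attribute 1 only
```

The consistency question the article addresses is whether, as the number of items grows, this kind of distance-based assignment recovers each examinee's true proficiency class.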

Original language: English (US)
Pages (from-to): 830-845
Number of pages: 16
Journal: Psychometrika
Volume: 84
Issue number: 3
DOIs
State: Published - Sep 15 2019

Keywords

  • DINA model
  • DINO model
  • G-DINA model
  • Q-matrix
  • cognitive diagnosis
  • general DCM
  • general nonparametric classification method
  • nonparametric classification

ASJC Scopus subject areas

  • General Psychology
  • Applied Mathematics
