Kullback-Leibler Information and Its Applications in Multi-Dimensional Adaptive Testing

Chun Wang, Hua Hua Chang, Keith A. Boughton

Research output: Contribution to journal › Article › peer-review

Abstract

This paper first discusses the relationship between Kullback-Leibler (KL) information and Fisher information in the context of multi-dimensional item response theory, and interprets that relationship geometrically for the two-dimensional case. This explication should allow a better understanding of the various item selection methods in multi-dimensional adaptive testing (MAT) that are based on these two information measures. The KL information index (KI) method is then discussed, and two theorems are derived to quantify the relationship between KI and the item parameters. Because most existing item selection algorithms for MAT are computationally intensive, which substantially limits the applicability of MAT, two versions of a simplified KL index (SKI), built from the analytical results, are proposed to mimic the behavior of KI while reducing the overall computational burden.
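The KI approach described in the abstract can be illustrated with a minimal sketch. The function names, the multidimensional 2PL parameterization, and the grid-average approximation of the integral over a neighborhood of the ability estimate are all illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def prob_m2pl(theta, a, d):
    # Correct-response probability under an assumed multidimensional 2PL model:
    # P(theta) = 1 / (1 + exp(-(a . theta + d)))
    return 1.0 / (1.0 + np.exp(-(np.dot(a, theta) + d)))

def kl_info(theta0, theta, a, d):
    # KL information of one dichotomous item between ability points theta0 and theta:
    # sum over responses x of P(x|theta0) * log(P(x|theta0) / P(x|theta))
    p0 = prob_m2pl(theta0, a, d)
    p = prob_m2pl(theta, a, d)
    return p0 * np.log(p0 / p) + (1 - p0) * np.log((1 - p0) / (1 - p))

def kl_index(theta_hat, a, d, delta=1.0, n=21):
    # Illustrative KL index: average KL information over a square
    # neighborhood of the current ability estimate theta_hat
    # (a grid-average stand-in for the integral in the KI method).
    grid = np.linspace(-delta, delta, n)
    vals = [kl_info(theta_hat, theta_hat + np.array([u, v]), a, d)
            for u in grid for v in grid]
    return float(np.mean(vals))
```

In adaptive testing, one would evaluate `kl_index` for every candidate item at the current ability estimate and administer the item with the largest value; the repeated double integration per candidate item is the computational cost that the paper's simplified KL index is designed to avoid.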

Original language: English (US)
Pages (from-to): 13-39
Number of pages: 27
Journal: Psychometrika
Volume: 76
Issue number: 1
DOIs
State: Published - Jan 2011

Keywords

  • Fisher information
  • Kullback-Leibler information
  • Multi-dimensional adaptive testing

ASJC Scopus subject areas

  • Psychology (all)
  • Applied Mathematics

