Information Theory and Its Application to Testing

Hua-Hua Chang, Chun Wang, Zhiliang Ying

Research output: Chapter in Book/Report/Conference proceeding › Chapter

Abstract

Information was first introduced by R. A. Fisher (1925) in a mathematically precise formulation for his statistical estimation theory (Kullback, 1959). For an observation Y assumed to follow a density f(·; θ) parametrized by θ, the Fisher information is defined by I(θ) = E_θ[∂ log f(Y; θ)/∂θ]², where E_θ denotes the expectation under the density f(·; θ). Fisher showed that the asymptotic variance of the maximum-likelihood estimator (MLE) is the inverse of I(θ), which also serves as a lower bound on the variance of any unbiased estimator. In other words, the MLE is optimal in the sense of minimizing asymptotic variance. An alternative expression for the Fisher information is I(θ) = -E_θ[∂² log f(Y; θ)/∂θ²], which gives it a geometric interpretation as the expected curvature of the log-likelihood function. The Fisher information is a nonnegative number when θ is a scalar and a nonnegative definite matrix when θ is a vector. It provides a lower bound, known as the Cramér-Rao inequality, on the variance (or covariance matrix in the multidimensional case) of any unbiased estimator of θ.
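The two expressions for the Fisher information in the abstract can be checked numerically for a simple model. The sketch below (an illustration, not part of the chapter) uses a Bernoulli(θ) observation, for which the closed form is I(θ) = 1/(θ(1−θ)), and verifies that the expected squared score and the negative expected curvature agree with it:

```python
import numpy as np

# Minimal sketch: Fisher information for a single Bernoulli(theta) observation,
# computed from both definitions given in the abstract.
theta = 0.3

def score(y, t):
    # d/dt of log f(y; t) = y*log t + (1-y)*log(1-t)
    return y / t - (1 - y) / (1 - t)

def d2_loglik(y, t):
    # second derivative of the log-likelihood in t
    return -y / t**2 - (1 - y) / (1 - t)**2

ys = np.array([0, 1])
probs = np.array([1 - theta, theta])  # P(Y=0), P(Y=1)

# I(theta) = E_theta[(d log f / d theta)^2]
info_score = np.sum(probs * score(ys, theta) ** 2)
# I(theta) = -E_theta[d^2 log f / d theta^2]
info_curvature = -np.sum(probs * d2_loglik(ys, theta))

closed_form = 1.0 / (theta * (1 - theta))
print(info_score, info_curvature, closed_form)  # all three agree
```

By the Cramér-Rao inequality, 1/I(θ) ≈ 0.21 here is the smallest variance any unbiased estimator of θ can attain from one such observation.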

Original language: English (US)
Title of host publication: Handbook of Item Response Theory
Subtitle of host publication: Statistical Tools
Editors: Wim J. van der Linden
Publisher: CRC Press
Pages: 105-124
Number of pages: 20
Volume: 2
ISBN (Electronic): 9781315362816, 9781315119144
ISBN (Print): 9781466514324
State: Published - 2016

ASJC Scopus subject areas

  • Mathematics(all)
