Abstract
Information was first introduced by R.A. Fisher (1925) in a mathematically precise formulation for his statistical estimation theory (Kullback, 1959). For an observation Y assumed to follow a density f(·; θ) parametrized by θ, the Fisher information is defined as $I(\theta) = E_\theta\left[\frac{\partial \log f(Y;\theta)}{\partial \theta}\right]^2$, where $E_\theta$ denotes the expectation under the density f(·; θ). Fisher showed that the asymptotic variance of the maximum-likelihood estimator (MLE) is the inverse of I(θ), which also serves as a lower bound on the variance of any other unbiased estimator. In other words, the MLE is optimal in the sense of minimizing variance. An alternative expression for the Fisher information is $I(\theta) = -E_\theta\left[\frac{\partial^2 \log f(Y;\theta)}{\partial \theta^2}\right]$, which gives its geometric interpretation as the expected curvature of the log-likelihood function. The Fisher information is a nonnegative number when θ is a scalar and a nonnegative definite matrix when θ is a vector. It provides a lower bound, known as the Cramér-Rao inequality, on the variance (or covariance matrix in the multidimensional case) of any unbiased estimator of θ.
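The two expressions for I(θ) quoted above can be checked numerically for a simple case. The sketch below (not from the chapter; the model, function names, and sample size are illustrative choices) uses a Bernoulli(θ) observation, for which the Fisher information is known in closed form as I(θ) = 1/(θ(1−θ)), and compares the score-squared and negative-curvature definitions by Monte Carlo.

```python
# Minimal sketch: verify E_theta[(d/dtheta log f)^2] == -E_theta[d^2/dtheta^2 log f]
# for a Bernoulli(theta) observation, where log f(y; theta) = y*log(theta) + (1-y)*log(1-theta).
import numpy as np

def score(y, theta):
    # First derivative of the Bernoulli log-likelihood with respect to theta
    return y / theta - (1 - y) / (1 - theta)

def neg_second_deriv(y, theta):
    # Negative second derivative of the Bernoulli log-likelihood
    return y / theta**2 + (1 - y) / (1 - theta)**2

rng = np.random.default_rng(0)
theta = 0.3
y = rng.binomial(1, theta, size=200_000)

print("E[score^2]              :", np.mean(score(y, theta) ** 2))
print("-E[second derivative]   :", np.mean(neg_second_deriv(y, theta)))
print("closed form 1/(th(1-th)):", 1 / (theta * (1 - theta)))
```

Both Monte Carlo estimates should agree with the closed-form value (about 4.76 for θ = 0.3), illustrating the equivalence of the two definitions stated in the abstract.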
Original language | English (US) |
---|---|
Title of host publication | Handbook of Item Response Theory |
Subtitle of host publication | Statistical Tools |
Editors | Wim J. van der Linden |
Publisher | CRC Press |
Pages | 105-123 |
Number of pages | 19 |
Volume | 2 |
ISBN (Electronic) | 9781315362816, 9781315119144 |
ISBN (Print) | 9781466514324, 9780367221201 |
DOIs | |
State | Published - 2016 |
ASJC Scopus subject areas
- General Mathematics