Information-theoretic upper and lower bounds for statistical estimation

Research output: Contribution to journal › Article › peer-review

Abstract

In this paper, we establish upper and lower bounds for some statistical estimation problems through concise information-theoretic arguments. Our upper-bound analysis is based on a simple yet general inequality that we call the information exponential inequality. We show that this inequality naturally leads to a general randomized estimation method, for which performance upper bounds can be obtained. The lower bounds, applicable to all statistical estimators, are obtained by original applications of some well-known information-theoretic inequalities, and they approximately match the obtained upper bounds for various important problems. Moreover, our framework can be regarded as a natural generalization of the standard minimax framework, in that we allow the performance of the estimator to vary for different possible underlying distributions according to a predefined prior.
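To make the abstract's construction concrete, here is a hedged sketch of the standard Gibbs-posterior estimator behind the "Gibbs algorithm" and "PAC-Bayes" keywords; the notation (prior π, loss ℓ, learning rate ρ) is ours, and the paper's exact statement of the information exponential inequality may differ. Given a prior π over parameters and observations Z_1, ..., Z_n, the randomized estimator samples θ from the Gibbs posterior

  \hat{\pi}(d\theta) \;\propto\; \exp\!\Big(-\rho \sum_{i=1}^{n} \ell(\theta, Z_i)\Big)\, \pi(d\theta), \qquad \rho > 0.

Performance upper bounds for such posteriors are typically derived from the Donsker-Varadhan variational identity, which converts an exponential-moment bound under the prior into a bound on the posterior average of f plus a KL penalty:

  \log \mathbb{E}_{\theta \sim \pi}\, e^{f(\theta)} \;=\; \sup_{\hat{\pi} \ll \pi} \Big[ \mathbb{E}_{\theta \sim \hat{\pi}} f(\theta) \;-\; \mathrm{KL}(\hat{\pi} \,\|\, \pi) \Big].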

Original language: English (US)
Pages (from-to): 1307-1321
Number of pages: 15
Journal: IEEE Transactions on Information Theory
Volume: 52
Issue number: 4
State: Published - Apr 2006
Externally published: Yes

Keywords

  • Gibbs algorithm
  • Lower bound
  • Minimax
  • PAC-Bayes
  • Randomized estimation
  • Statistical estimation

ASJC Scopus subject areas

  • Information Systems
  • Computer Science Applications
  • Library and Information Sciences
