On the convergence of MDL density estimation

Research output: Contribution to journal › Conference article › peer-review

Abstract

We present a general information exponential inequality that measures the statistical complexity of certain deterministic and randomized density estimators. Using this inequality, we are able to improve classical results concerning the convergence of two-part code MDL in [1]. Moreover, we are able to derive clean finite-sample convergence bounds that are not obtainable using previous approaches.
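For context, the following is a minimal LaTeX sketch of the standard two-part code MDL density estimator that the abstract refers to; the notation (model class $\Theta$, codelength function $L$) is illustrative and standard in the MDL literature rather than taken verbatim from the paper.

Given i.i.d. observations $X_1, \dots, X_n$ and a countable class of candidate densities $\{p_\theta : \theta \in \Theta\}$ with a prefix codelength $L(\theta)$ satisfying the Kraft inequality, the two-part code MDL estimator minimizes the total codelength of model plus data:

\[
\hat{\theta}_n \;=\; \arg\min_{\theta \in \Theta}
\Bigl[\, -\sum_{i=1}^{n} \log p_\theta(X_i) \;+\; L(\theta) \Bigr],
\qquad
\sum_{\theta \in \Theta} 2^{-L(\theta)} \le 1 .
\]

Convergence results of the kind discussed in the abstract bound how fast the density $p_{\hat{\theta}_n}$ approaches the true data-generating density as $n$ grows.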

Original language: English (US)
Pages (from-to): 315-330
Number of pages: 16
Journal: Lecture Notes in Artificial Intelligence (Subseries of Lecture Notes in Computer Science)
Volume: 3120
DOIs
State: Published - 2004
Externally published: Yes
Event: 17th Annual Conference on Learning Theory, COLT 2004 - Banff, Canada
Duration: Jul 1, 2004 – Jul 4, 2004

ASJC Scopus subject areas

  • Theoretical Computer Science
  • General Computer Science
