Abstract
We present a general information exponential inequality that measures the statistical complexity of some deterministic and randomized density estimators. Using this inequality, we improve the classical results of [1] on the convergence of two-part code MDL. Moreover, we derive clean finite-sample convergence bounds that are not obtainable with previous approaches.
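For context, the two-part code MDL density estimator discussed in the abstract is conventionally defined as below. This is a minimal sketch of the standard textbook formulation, not an equation reproduced from the paper; the countable model class $\Gamma$, the codelength function $L$, and the sample $X_1,\dots,X_n$ are notation assumed here for illustration.

```latex
% Standard two-part code MDL density estimator (illustrative sketch;
% Gamma, L, and the sample X_1,...,X_n are assumed notation, not taken
% from the paper). The estimator trades data fit against model
% description length:
\[
  \hat{p} \;=\; \operatorname*{arg\,min}_{p \in \Gamma}
    \Big[\, -\sum_{i=1}^{n} \ln p(X_i) \;+\; L(p) \,\Big],
\]
% where L is a prefix codelength on Gamma, i.e. it satisfies
% Kraft's inequality:
\[
  \sum_{p \in \Gamma} e^{-L(p)} \;\le\; 1.
\]
```

The first term is the codelength of the data under model $p$ and the second is the codelength of the model itself; convergence results of the kind the abstract describes bound how fast $\hat{p}$ approaches the true density as $n$ grows.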
Original language | English (US)
---|---
Pages (from-to) | 315-330
Number of pages | 16
Journal | Lecture Notes in Artificial Intelligence (Subseries of Lecture Notes in Computer Science)
Volume | 3120
State | Published - 2004
Externally published | Yes
Event | 17th Annual Conference on Learning Theory, COLT 2004; Banff, Canada; Jul 1 2004 → Jul 4 2004
ASJC Scopus subject areas
- Theoretical Computer Science
- General Computer Science