Abstract
If N is standard Gaussian, the minimum mean-square error (MMSE) of estimating a random variable X based on √snr X + N vanishes at least as fast as 1/snr as snr → ∞. We define the MMSE dimension of X as the limit, as snr → ∞, of the product of snr and the MMSE. The MMSE dimension is also shown to be the asymptotic ratio of nonlinear MMSE to linear MMSE. For discrete, absolutely continuous, or mixed distributions, we show that the MMSE dimension equals Rényi's information dimension. However, for a class of self-similar singular X (e.g., the Cantor distribution), we show that the product of snr and MMSE oscillates around the information dimension periodically in snr (dB). We also show that these results extend considerably beyond Gaussian noise under various technical conditions.
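For concreteness, a minimal sketch of the quantity described above, written in the standard conditional-mean-estimator notation; the symbols mmse(X, snr) and D(X) are assumed here for illustration rather than quoted from the paper:

```latex
% MMSE of estimating X from the noisy observation \sqrt{snr} X + N, with N standard Gaussian:
\mathrm{mmse}(X,\mathsf{snr})
  = \mathbb{E}\!\left[\bigl(X - \mathbb{E}\bigl[X \mid \sqrt{\mathsf{snr}}\,X + N\bigr]\bigr)^{2}\right]

% MMSE dimension: the limit (when it exists) of snr times the MMSE as snr grows:
D(X) = \lim_{\mathsf{snr}\to\infty} \mathsf{snr}\cdot \mathrm{mmse}(X,\mathsf{snr})
```

Under this notation, the abstract's statement is that D(X) coincides with Rényi's information dimension for discrete, absolutely continuous, or mixed X, while for certain self-similar singular X the product snr · mmse(X, snr) oscillates around the information dimension rather than converging.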
| Original language | English (US) |
| --- | --- |
| Article number | 5961853 |
| Pages (from-to) | 4857-4879 |
| Number of pages | 23 |
| Journal | IEEE Transactions on Information Theory |
| Volume | 57 |
| Issue number | 8 |
| DOIs | |
| State | Published - Aug 2011 |
| Externally published | Yes |
Keywords
- Additive noise
- Bayesian statistics
- Gaussian noise
- Rényi information dimension
- high-SNR asymptotics
- minimum mean-square error (MMSE)
- mutual information
- non-Gaussian noise
ASJC Scopus subject areas
- Information Systems
- Computer Science Applications
- Library and Information Sciences