Abstract
Shannon's analysis of the fundamental capacity limits for memoryless communication channels has been refined over time. In this paper, the maximum volume $M^*_{\mathrm{avg}}(n,\epsilon)$ of length-$n$ codes subject to an average decoding error probability $\epsilon$ is shown to satisfy the following tight asymptotic lower and upper bounds as $n \to \infty$:

$$\underline{A}_\epsilon + o(1) \;\le\; \log M^*_{\mathrm{avg}}(n,\epsilon) - \Bigl[\, nC - \sqrt{nV_\epsilon}\,Q^{-1}(\epsilon) + \tfrac{1}{2}\log n \,\Bigr] \;\le\; \overline{A}_\epsilon + o(1),$$

where $C$ is the Shannon capacity, $V_\epsilon$ is the $\epsilon$-channel dispersion (or second-order coding rate), $Q$ is the tail probability of the standard normal distribution, and the constants $\underline{A}_\epsilon$ and $\overline{A}_\epsilon$ are explicitly identified. This expression holds under mild regularity assumptions on the channel, including nonsingularity. The gap $\overline{A}_\epsilon - \underline{A}_\epsilon$ is one nat for weakly symmetric channels in the Cover-Thomas sense, and typically a few nats for other symmetric channels, for the binary symmetric channel, and for the Z channel. The derivation is based on strong large-deviations analysis and refined central limit asymptotics. A random coding scheme that achieves the lower bound is presented. The codewords are drawn from a capacity-achieving input distribution modified by an $O(1/\sqrt{n})$ correction term.
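As a rough numerical illustration (not taken from the paper), the sketch below evaluates the bracketed normal-approximation term $nC - \sqrt{nV_\epsilon}\,Q^{-1}(\epsilon) + \tfrac{1}{2}\log n$ for a binary symmetric channel, using the standard closed-form capacity and dispersion expressions for the BSC (for which the dispersion does not depend on $\epsilon$). The $O(1)$ constants $\underline{A}_\epsilon$ and $\overline{A}_\epsilon$ identified in the paper are omitted, and the crossover probability $p = 0.11$ and blocklengths are arbitrary choices for illustration.

```python
# Minimal sketch, assuming a BSC with crossover probability p.
# Evaluates n*C - sqrt(n*V) * Q^{-1}(eps) + (1/2) log n in nats,
# i.e., the approximation to log M*_avg(n, eps) up to the O(1)
# constants identified in the paper (omitted here).
import math
from statistics import NormalDist

def bsc_capacity_nats(p: float) -> float:
    """Shannon capacity of BSC(p) in nats: log 2 - h(p)."""
    h = -p * math.log(p) - (1 - p) * math.log(1 - p)
    return math.log(2) - h

def bsc_dispersion_nats(p: float) -> float:
    """Channel dispersion of BSC(p) in nats^2: p(1-p) log^2((1-p)/p)."""
    return p * (1 - p) * math.log((1 - p) / p) ** 2

def log_volume_approx(n: int, eps: float, p: float) -> float:
    """Third-order normal approximation to log M*_avg(n, eps), in nats."""
    q_inv = NormalDist().inv_cdf(1 - eps)  # Q^{-1}(eps) = Phi^{-1}(1 - eps)
    C = bsc_capacity_nats(p)
    V = bsc_dispersion_nats(p)
    return n * C - math.sqrt(n * V) * q_inv + 0.5 * math.log(n)

if __name__ == "__main__":
    for n in (500, 1000, 2000):
        print(n, round(log_volume_approx(n, eps=1e-3, p=0.11), 2))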
Original language | English (US) |
---|---|
Article number | 7803558 |
Pages (from-to) | 2278-2313 |
Number of pages | 36 |
Journal | IEEE Transactions on Information Theory |
Volume | 63 |
Issue number | 4 |
DOIs | |
State | Published - Apr 2017 |
Keywords
- Edgeworth expansion
- Fisher information
- Neyman-Pearson testing
- Shannon theory
- Z channel
- binary symmetric channel
- capacity
- exponentially tilted distributions
- large deviations
- local limit theorem
- random codes
ASJC Scopus subject areas
- Information Systems
- Computer Science Applications
- Library and Information Sciences