TY - JOUR
T1 - On error exponents of modulo lattice additive noise channels
AU - Liu, Tie
AU - Moulin, Pierre
AU - Koetter, Ralf
N1 - Funding Information:
Manuscript received June 4, 2004; revised October 27, 2005. This work was supported by the National Science Foundation under ITR Grants CCR-0081268 and CCR-0325924. The material in this paper was presented in part at the IEEE Information Theory Workshop, San Antonio, TX, October 2004. The authors are with the Department of Electrical and Computer Engineering, University of Illinois at Urbana-Champaign, Urbana, IL 61801 USA (e-mail: [email protected]; [email protected]; [email protected]). Communicated by A. Lapidoth, Associate Editor for Shannon Theory. Digital Object Identifier 10.1109/TIT.2005.862077
PY - 2006/2
Y1 - 2006/2
N2 - Modulo lattice additive noise (MLAN) channels appear in the analysis of structured binning codes for Costa's dirty-paper channel and of nested lattice codes for the additive white Gaussian noise (AWGN) channel. In this paper, we derive a new lower bound on the error exponents of the MLAN channel. With a proper choice of the shaping lattice and the scaling parameter, the new lower bound coincides with the random-coding lower bound on the error exponents of the AWGN channel at the same signal-to-noise ratio (SNR) in the sphere-packing and straight-line regions. This result implies that, at least for rates close to channel capacity, 1) writing on dirty paper is as reliable as writing on clean paper; and 2) lattice encoding and decoding suffer no loss of error exponents relative to the optimal codes (with maximum-likelihood decoding) for the AWGN channel.
AB - Modulo lattice additive noise (MLAN) channels appear in the analysis of structured binning codes for Costa's dirty-paper channel and of nested lattice codes for the additive white Gaussian noise (AWGN) channel. In this paper, we derive a new lower bound on the error exponents of the MLAN channel. With a proper choice of the shaping lattice and the scaling parameter, the new lower bound coincides with the random-coding lower bound on the error exponents of the AWGN channel at the same signal-to-noise ratio (SNR) in the sphere-packing and straight-line regions. This result implies that, at least for rates close to channel capacity, 1) writing on dirty paper is as reliable as writing on clean paper; and 2) lattice encoding and decoding suffer no loss of error exponents relative to the optimal codes (with maximum-likelihood decoding) for the AWGN channel.
KW - Additive white Gaussian noise (AWGN) channel
KW - Costa's dirty-paper channel
KW - Error exponents
KW - Lattice decoding
KW - Modulo lattice additive noise (MLAN) channel
KW - Nested lattice codes
UR - http://www.scopus.com/inward/record.url?scp=31744433626&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=31744433626&partnerID=8YFLogxK
U2 - 10.1109/TIT.2005.862077
DO - 10.1109/TIT.2005.862077
M3 - Article
AN - SCOPUS:31744433626
SN - 0018-9448
VL - 52
SP - 454
EP - 471
JO - IEEE Transactions on Information Theory
JF - IEEE Transactions on Information Theory
IS - 2
ER -