Abstract
Capacity formulas and random-coding exponents are derived for a generalized family of Gel'fand-Pinsker coding problems. These exponents yield asymptotic upper bounds on the achievable log probability of error. In our model, information is to be reliably transmitted through a noisy channel with finite input and output alphabets and a random state sequence, and the channel is selected by a hypothetical adversary. Partial information about the state sequence is available to the encoder, adversary, and decoder. The design of the transmitter is subject to a cost constraint. Two families of channels are considered: 1) compound discrete memoryless channels (CDMC), and 2) channels with arbitrary memory, subject to an additive cost constraint or, more generally, to a hard constraint on the conditional type of the channel output given the input. The two problems are closely connected. The random-coding exponent is achieved using a stacked binning scheme and a maximum penalized mutual information decoder, which may be thought of as an empirical generalized maximum a posteriori decoder. For channels with arbitrary memory, the random-coding exponents are larger than their CDMC counterparts. Applications of this study include watermarking, data hiding, communication in the presence of partially known interferers, and problems such as broadcast channels, all of which involve the fundamental idea of binning.
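As background for the setting being generalized (this formula is classical, not taken from the abstract): in the standard Gel'fand-Pinsker problem, where the state sequence is known noncausally to the encoder only and the memoryless channel $p(y \mid x, s)$ is fixed rather than adversarially selected, the capacity is given by the well-known single-letter expression sketched below, maximized over an auxiliary random variable $U$.

```latex
% Classical Gel'fand--Pinsker capacity: state known noncausally to the
% encoder only, fixed DMC p(y|x,s), auxiliary random variable U, and a
% deterministic input map x = f(u, s).
C_{\mathrm{GP}} = \max_{p(u \mid s),\; x = f(u,s)} \bigl[\, I(U;Y) - I(U;S) \,\bigr]
```

The paper's generalized formulas additionally account for adversarial channel selection, partial state information at the encoder, adversary, and decoder, and the cost constraint on the transmitter.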
| Field | Value |
|---|---|
| Original language | English (US) |
| Pages (from-to) | 1326-1347 |
| Number of pages | 22 |
| Journal | IEEE Transactions on Information Theory |
| Volume | 53 |
| Issue number | 4 |
| DOIs | |
| State | Published - Apr 2007 |
Keywords
- Arbitrarily varying channels
- Broadcast channels
- Capacity
- Channel coding with side information
- Data hiding
- Error exponents
- Maximum a posteriori probability (MAP) decoding
- Method of types
- Random binning
- Randomized codes
- Reliability function
- Universal coding and decoding
- Watermarking
ASJC Scopus subject areas
- Information Systems
- Computer Science Applications
- Library and Information Sciences