Note on mutual information and orthogonal space-time codes

Guy Bresler, Bruce Hajek

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Bit-error probability and mutual information rate have both been used as performance criteria for space-time codes for wireless communication. We use mutual information as the performance criterion because it determines the possible rate of communication when using an outer code. In this context, linear dispersion codes, first proposed by Hassibi and Hochwald, are appealing because of the high mutual information they provide, as well as their simplicity. Because complexity increases with the number of symbols, it may be sensible in some settings to fix the number of symbols sent per data bit. In the dissertation of Y. Jiang, it was conjectured that among linear dispersion codes with independent, binary symbols, orthogonal space-time codes are optimal in the following sense: they maximize mutual information subject to an average power constraint on each symbol. We prove the conjecture for a fixed number of real symbols with arbitrary distributions.
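As an illustrative aside (not taken from the paper, which treats binary and arbitrarily distributed real symbols rather than Gaussian inputs), the mutual-information criterion can be made concrete with a small numerical sketch. The snippet below estimates the ergodic mutual information of one well-known orthogonal space-time code, the 2x1 Alamouti code, over Rayleigh fading with i.i.d. Gaussian symbols; the orthogonal structure decouples the channel into parallel scalar channels, each seeing the full channel gain at half the transmit power. All function and variable names here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def alamouti_mutual_info(snr_db: float, n_trials: int = 100_000) -> float:
    """Ergodic mutual information (bits per channel use) of the 2x1
    Alamouti code with i.i.d. Gaussian symbols over Rayleigh fading.

    The orthogonal code structure turns the 2x1 MISO channel into two
    parallel scalar channels over two time slots, each with gain
    ||h||^2 and half of the total transmit power per symbol.
    """
    snr = 10.0 ** (snr_db / 10.0)
    # h ~ CN(0, I_2): i.i.d. unit-variance complex Gaussian fading.
    h = (rng.standard_normal((n_trials, 2))
         + 1j * rng.standard_normal((n_trials, 2))) / np.sqrt(2.0)
    gain = np.sum(np.abs(h) ** 2, axis=1)  # ||h||^2 per realization
    # Two symbols over two slots -> one scalar channel per channel use,
    # so the per-use rate is log2(1 + (snr/2) * ||h||^2).
    return float(np.mean(np.log2(1.0 + (snr / 2.0) * gain)))

for snr_db in (0, 10, 20):
    print(f"{snr_db:>2} dB: {alamouti_mutual_info(snr_db):.3f} bits/use")
```

Under these assumptions the estimate coincides with the 2x1 MISO channel capacity, which is the sense in which an orthogonal design loses nothing here; the paper's contribution is the optimality statement among linear dispersion codes with independent symbols under a per-symbol average power constraint.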

Original language: English (US)
Title of host publication: Proceedings - 2006 IEEE International Symposium on Information Theory, ISIT 2006
Pages: 1315-1318
Number of pages: 4
DOIs
State: Published - 2006
Event: 2006 IEEE International Symposium on Information Theory, ISIT 2006 - Seattle, WA, United States
Duration: Jul 9, 2006 - Jul 14, 2006

Publication series

Name: IEEE International Symposium on Information Theory - Proceedings
ISSN (Print): 2157-8101

Other

Other: 2006 IEEE International Symposium on Information Theory, ISIT 2006
Country/Territory: United States
City: Seattle, WA
Period: 7/9/06 - 7/14/06

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Information Systems
  • Applied Mathematics
  • Modeling and Simulation
