An extremal inequality motivated by multiterminal information-theoretic problems

Research output: Contribution to journal › Article

Abstract

We prove a new extremal inequality, motivated by the vector Gaussian broadcast channel and the distributed source coding with a single quadratic distortion constraint problems. As a corollary, this inequality yields a generalization of the classical entropy-power inequality (EPI). As another corollary, this inequality sheds insight into maximizing the differential entropy of the sum of two dependent random variables.
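
As background, the classical entropy-power inequality (EPI) that the paper's result generalizes can be stated in its standard textbook form (this is not the paper's new extremal inequality, only the baseline it extends): for independent random vectors X and Y in R^n with densities,

    e^{\frac{2}{n} h(X+Y)} \;\ge\; e^{\frac{2}{n} h(X)} + e^{\frac{2}{n} h(Y)},

with equality if and only if X and Y are Gaussian with proportional covariance matrices.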

Original language: English (US)
Pages (from-to): 1839-1851
Number of pages: 13
Journal: IEEE Transactions on Information Theory
Volume: 53
Issue number: 5
DOIs: 10.1109/TIT.2007.894680
State: Published - May 1, 2007

Keywords

  • Differential entropy
  • Distributed source coding
  • Entropy-power inequality (EPI)
  • Fisher information
  • Vector Gaussian broadcast channel

ASJC Scopus subject areas

  • Information Systems
  • Computer Science Applications
  • Library and Information Sciences

Cite this

An extremal inequality motivated by multiterminal information-theoretic problems. / Liu, Tie; Viswanath, Pramod.

In: IEEE Transactions on Information Theory, Vol. 53, No. 5, 01.05.2007, p. 1839-1851.

Research output: Contribution to journal › Article

@article{034d8d5b942d44bcbc4473e05546069b,
title = "An extremal inequality motivated by multiterminal information-theoretic problems",
abstract = "We prove a new extremal inequality, motivated by the vector Gaussian broadcast channel and the distributed source coding with a single quadratic distortion constraint problems. As a corollary, this inequality yields a generalization of the classical entropy-power inequality (EPI). As another corollary, this inequality sheds insight into maximizing the differential entropy of the sum of two dependent random variables.",
keywords = "Differential entropy, Distributed source coding, Entropy- power inequality (EPI), Fisher information, Vector Gaussian broadcast channel",
author = "Tie Liu and Pramod Viswanath",
year = "2007",
month = "5",
day = "1",
doi = "10.1109/TIT.2007.894680",
language = "English (US)",
volume = "53",
pages = "1839--1851",
journal = "IEEE Transactions on Information Theory",
issn = "0018-9448",
publisher = "Institute of Electrical and Electronics Engineers Inc.",
number = "5",

}

TY - JOUR

T1 - An extremal inequality motivated by multiterminal information-theoretic problems

AU - Liu, Tie

AU - Viswanath, Pramod

PY - 2007/5/1

Y1 - 2007/5/1

N2 - We prove a new extremal inequality, motivated by the vector Gaussian broadcast channel and the distributed source coding with a single quadratic distortion constraint problems. As a corollary, this inequality yields a generalization of the classical entropy-power inequality (EPI). As another corollary, this inequality sheds insight into maximizing the differential entropy of the sum of two dependent random variables.

AB - We prove a new extremal inequality, motivated by the vector Gaussian broadcast channel and the distributed source coding with a single quadratic distortion constraint problems. As a corollary, this inequality yields a generalization of the classical entropy-power inequality (EPI). As another corollary, this inequality sheds insight into maximizing the differential entropy of the sum of two dependent random variables.

KW - Differential entropy

KW - Distributed source coding

KW - Entropy-power inequality (EPI)

KW - Fisher information

KW - Vector Gaussian broadcast channel

UR - http://www.scopus.com/inward/record.url?scp=34248373869&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=34248373869&partnerID=8YFLogxK

U2 - 10.1109/TIT.2007.894680

DO - 10.1109/TIT.2007.894680

M3 - Article

AN - SCOPUS:34248373869

VL - 53

SP - 1839

EP - 1851

JO - IEEE Transactions on Information Theory

JF - IEEE Transactions on Information Theory

SN - 0018-9448

IS - 5

ER -