Network Compression: Worst Case Analysis

Himanshu Asnani, Ilan Shomorony, A. Salman Avestimehr, Tsachy Weissman

Research output: Contribution to journal › Article › peer-review


We study the problem of communicating a distributed correlated memoryless source over a memoryless network, from source nodes to destination nodes, under quadratic distortion constraints. We establish the following two complementary results: 1) for an arbitrary memoryless network, among all distributed memoryless sources of a given correlation, Gaussian sources are least compressible, that is, they admit the smallest set of achievable distortion tuples and 2) for any memoryless source to be communicated over a memoryless additive-noise network, among all noise processes of a given correlation, Gaussian noise admits the smallest achievable set of distortion tuples. We establish these results constructively by showing how schemes for the corresponding Gaussian problems can be applied to achieve similar performance for (source or noise) distributions that are not necessarily Gaussian but have the same covariance.
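The first result generalizes a classical single-source fact: among all sources of a given variance, the Gaussian has the largest rate-distortion function under quadratic distortion, since it maximizes differential entropy and hence the Shannon lower bound. A minimal numerical sketch of that single-source intuition (illustrative only, not the paper's network setting) compares the exact Gaussian rate-distortion function against the Shannon lower bound of a uniform source with matched variance:

```python
import math

def gaussian_rd(var, D):
    # Exact rate-distortion function of a Gaussian source under
    # quadratic distortion: R(D) = max(0, (1/2) log2(var / D)) bits.
    return max(0.0, 0.5 * math.log2(var / D))

def uniform_slb(a, D):
    # Shannon lower bound for Uniform[-a, a]:
    # R(D) >= h(X) - (1/2) log2(2*pi*e*D), with h(X) = log2(2a).
    h = math.log2(2 * a)
    return max(0.0, h - 0.5 * math.log2(2 * math.pi * math.e * D))

a = 1.0
var = a * a / 3.0  # variance of Uniform[-1, 1], matched to the Gaussian
for D in [0.01, 0.05, 0.1]:
    # The Gaussian rate always dominates: it needs the most bits
    # among equal-variance sources, i.e., it is least compressible.
    print(f"D={D}: Gaussian R(D)={gaussian_rd(var, D):.3f} bits, "
          f"uniform SLB={uniform_slb(a, D):.3f} bits")
```

The paper's contribution is to extend this worst-case role of the Gaussian from a single source to arbitrary memoryless networks with distortion tuples, and dually to additive noise.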

Original language: English (US)
Article number: 7122879
Pages (from-to): 3980-3995
Number of pages: 16
Journal: IEEE Transactions on Information Theory
Issue number: 7
State: Published - Jul 1 2015
Externally published: Yes


Keywords

  • worst-case source
  • joint source-channel coding
  • network compression
  • worst-case noise

ASJC Scopus subject areas

  • Information Systems
  • Computer Science Applications
  • Library and Information Sciences

