Converses for distributed estimation via strong data processing inequalities

Aolin Xu, Maxim Raginsky

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

We consider the problem of distributed estimation, where local processors observe independent samples conditioned on a common random parameter of interest, map the observations to a finite number of bits, and send these bits to a remote estimator over independent noisy channels. We derive converse results for this problem, such as lower bounds on Bayes risk. The main technical tools include a lower bound on the Bayes risk via mutual information and small ball probability, as well as strong data processing inequalities for the relative entropy. Our results can recover and improve some existing results on distributed estimation with noiseless channels, and also capture the effect of noisy channels on the estimation performance.
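For context, the two technical tools named in the abstract can be sketched in general form. The following is an illustrative LaTeX sketch using generic notation (parameter W, estimate Ŵ, observations Y, loss ℓ, channel K); the exact statements and constants in the paper may differ.

\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}

% 1) A Fano-type lower bound on estimation error via mutual information
%    and small-ball probability: for any estimate \hat{W} of W formed
%    from the observations Y, and any radius \rho > 0,
\[
  \Pr\!\left[\ell(W,\hat{W}) > \rho\right]
  \;\ge\; 1 - \frac{I(W;Y) + \log 2}{\log\!\left(1/\mathcal{L}_W(\rho)\right)},
  \qquad
  \mathcal{L}_W(\rho) \;=\; \sup_{w}\Pr\!\left[\ell(W,w) \le \rho\right].
\]

% 2) A strong data processing inequality (SDPI) for relative entropy:
%    a noisy channel K with contraction coefficient \eta(K) < 1 satisfies
\[
  D(PK \,\|\, QK) \;\le\; \eta(K)\, D(P \,\|\, Q)
  \qquad \text{for all input distributions } P, Q.
\]

\end{document}

Roughly, the SDPI limits how much information about the parameter survives transmission over each noisy channel, so the mutual information term in the first inequality is contracted by the channels' SDPI coefficients; by Markov's inequality, the Bayes risk is at least the radius times the error probability, which yields a strengthened lower bound.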

Original language: English (US)
Title of host publication: Proceedings - 2015 IEEE International Symposium on Information Theory, ISIT 2015
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 2376-2380
Number of pages: 5
ISBN (Electronic): 9781467377041
DOIs
State: Published - Sep 28, 2015
Event: IEEE International Symposium on Information Theory, ISIT 2015 - Hong Kong, Hong Kong
Duration: Jun 14, 2015 – Jun 19, 2015

Publication series

Name: IEEE International Symposium on Information Theory - Proceedings
Volume: 2015-June
ISSN (Print): 2157-8095

Other

Other: IEEE International Symposium on Information Theory, ISIT 2015
Country: Hong Kong
City: Hong Kong
Period: 6/14/15 – 6/19/15

Keywords

  • Bayes risk
  • Distributed estimation
  • Strong data processing inequalities

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Information Systems
  • Modeling and Simulation
  • Applied Mathematics

