Sequential prediction with coded side information under logarithmic loss

Yanina Shkel, Maxim Raginsky, Sergio Verdú

Research output: Contribution to journal › Conference article › peer-review

Abstract

We study the problem of sequential prediction with coded side information under logarithmic loss (log-loss). We show an operational equivalence between this setup and lossy compression with log-loss distortion. Using this insight, together with recent work on lossy compression with log-loss, we connect prediction strategies with distributions in a certain subset of the probability simplex. This allows us to derive a Shtarkov-like bound for regret and to evaluate the regret for several illustrative classes of experts. In the present work, we mainly focus on the “batch” side information setting with sequential prediction.
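The Shtarkov-like regret bound mentioned in the abstract can be illustrated with a small numerical sketch (not the paper's code, and with a hypothetical expert class): for a finite set of experts, each a distribution over sequences, the minimax log-loss regret equals the logarithm of the Shtarkov sum, the total over all sequences of the best expert's probability of that sequence.

```python
import itertools
import math

def bernoulli_prob(seq, theta):
    """Probability of a binary sequence under an i.i.d. Bernoulli(theta) expert."""
    k = sum(seq)
    return theta ** k * (1 - theta) ** (len(seq) - k)

def shtarkov_regret(n, thetas):
    """Minimax log-loss regret (in bits) for the given Bernoulli experts
    on length-n binary sequences: log2 of the Shtarkov sum
    S = sum over all x^n of max_theta p_theta(x^n)."""
    s = sum(
        max(bernoulli_prob(x, th) for th in thetas)
        for x in itertools.product([0, 1], repeat=n)
    )
    return math.log2(s)

# Example: two experts, theta in {0.25, 0.75}, horizon n = 4.
# The regret works out to 3*log2(3) - 4, about 0.755 bits, which is
# below log2(2) = 1 bit, the trivial bound for a two-expert class.
regret = shtarkov_regret(4, [0.25, 0.75])
```

The illustration uses plain Bernoulli experts for concreteness; the paper's setting additionally involves coded side information, which this sketch does not model.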

Original language: English (US)
Pages (from-to): 753-769
Number of pages: 17
Journal: Proceedings of Machine Learning Research
Volume: 83
State: Published - 2018
Event: 29th International Conference on Algorithmic Learning Theory, ALT 2018 - Lanzarote, Spain
Duration: Apr 7 2018 - Apr 9 2018

ASJC Scopus subject areas

  • Artificial Intelligence
  • Software
  • Control and Systems Engineering
  • Statistics and Probability

