DeepCode: Feedback codes via deep learning

Hyeji Kim, Yihan Jiang, Sreeram Kannan, Sewoong Oh, Pramod Viswanath

Research output: Contribution to journal › Conference article

Abstract

The design of codes for communicating reliably over a statistically well-defined channel is an important endeavor involving deep mathematical research and wide-ranging practical applications. In this work, we present the first family of codes obtained via deep learning, which significantly beats state-of-the-art codes designed over several decades of research. The communication channel under consideration is the Gaussian noise channel with feedback, whose study was initiated by Shannon; feedback is known theoretically to improve the reliability of communication, but no practical codes that do so have ever been successfully constructed. We break this logjam by integrating information-theoretic insights harmoniously with recurrent-neural-network-based encoders and decoders to create novel codes that outperform known codes by three orders of magnitude in reliability. We also demonstrate several desirable properties of the codes: (a) generalization to larger block lengths; (b) composability with known codes; (c) adaptation to practical constraints. This result also has broader ramifications for coding theory: even when the channel has a clear mathematical model, deep learning methodologies, when combined with channel-specific information-theoretic insights, can potentially beat state-of-the-art codes constructed over decades of mathematical research.
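The mechanism the abstract alludes to — using feedback to iteratively cancel the receiver's noise — can be illustrated with a toy linear scheme in the spirit of Schalkwijk–Kailath. This is not the paper's DeepCode architecture (which learns RNN encoders and decoders); it is a minimal hand-coded sketch, assuming idealized noiseless feedback, showing why feedback improves reliability over plain repetition at the same rate and power:

```python
import numpy as np

rng = np.random.default_rng(0)

n_bits = 20000   # number of message bits
n_uses = 3       # channel uses per bit (rate 1/3, unit power per use)
sigma = 0.7      # AWGN standard deviation

theta = rng.choice([-1.0, 1.0], size=n_bits)  # BPSK message symbols
noise = sigma * rng.standard_normal((n_uses, n_bits))

# Baseline: repetition code, no feedback. Decode by averaging the outputs.
y_rep = theta + noise                     # broadcast: theta sent on every use
est_rep = y_rep.mean(axis=0)
ber_repetition = np.mean(np.sign(est_rep) != theta)

# Feedback scheme: after each use, the transmitter learns the receiver's
# current estimation error via (idealized, noiseless) feedback and spends
# the next use transmitting that error, normalized to unit power.
est_fb = theta + noise[0]                 # use 1: send theta itself
eps = noise[0]                            # receiver's error, known to Tx via feedback
eps_std = sigma                           # its standard deviation
for k in range(1, n_uses):
    x = eps / eps_std                     # unit-power retransmission of the error
    y = x + noise[k]
    est_fb = est_fb - eps_std * y         # cancels eps, leaves -eps_std * z_k
    eps = -eps_std * noise[k]             # new, smaller error
    eps_std = eps_std * sigma             # error std shrinks geometrically
ber_feedback = np.mean(np.sign(est_fb) != theta)
```

With the same three channel uses per bit, the residual noise standard deviation is `sigma**3 ≈ 0.34` for the feedback scheme versus `sigma / sqrt(3) ≈ 0.40` for repetition, so the feedback decoder makes markedly fewer errors. DeepCode replaces this hand-designed linear refinement with a learned RNN, which is what yields the orders-of-magnitude reliability gains the abstract reports.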

Original language: English (US)
Pages (from-to): 9436-9446
Number of pages: 11
Journal: Advances in Neural Information Processing Systems
Volume: 2018-December
State: Published - Jan 1 2018
Event: 32nd Conference on Neural Information Processing Systems, NeurIPS 2018 - Montreal, Canada
Duration: Dec 2 2018 - Dec 8 2018

Fingerprint

Feedback
Recurrent neural networks
Mathematical models
Communication
Deep learning

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing

Cite this

Kim, H., Jiang, Y., Kannan, S., Oh, S., & Viswanath, P. (2018). DeepCode: Feedback codes via deep learning. Advances in Neural Information Processing Systems, 2018-December, 9436-9446.

@article{569e01c9631c4c099bbc7c4c417abb25,
  title = "DeepCode: Feedback codes via deep learning",
  abstract = "The design of codes for communicating reliably over a statistically well-defined channel is an important endeavor involving deep mathematical research and wide-ranging practical applications. In this work, we present the first family of codes obtained via deep learning, which significantly beats state-of-the-art codes designed over several decades of research. The communication channel under consideration is the Gaussian noise channel with feedback, whose study was initiated by Shannon; feedback is known theoretically to improve the reliability of communication, but no practical codes that do so have ever been successfully constructed. We break this logjam by integrating information-theoretic insights harmoniously with recurrent-neural-network-based encoders and decoders to create novel codes that outperform known codes by three orders of magnitude in reliability. We also demonstrate several desirable properties of the codes: (a) generalization to larger block lengths; (b) composability with known codes; (c) adaptation to practical constraints. This result also has broader ramifications for coding theory: even when the channel has a clear mathematical model, deep learning methodologies, when combined with channel-specific information-theoretic insights, can potentially beat state-of-the-art codes constructed over decades of mathematical research.",
  author = "Hyeji Kim and Yihan Jiang and Sreeram Kannan and Sewoong Oh and Pramod Viswanath",
  year = "2018",
  month = "1",
  day = "1",
  language = "English (US)",
  volume = "2018-December",
  pages = "9436--9446",
  journal = "Advances in Neural Information Processing Systems",
  issn = "1049-5258",
}

TY - JOUR

T1 - DeepCode

T2 - Feedback codes via deep learning

AU - Kim, Hyeji

AU - Jiang, Yihan

AU - Kannan, Sreeram

AU - Oh, Sewoong

AU - Viswanath, Pramod

PY - 2018/1/1

Y1 - 2018/1/1

N2 - The design of codes for communicating reliably over a statistically well-defined channel is an important endeavor involving deep mathematical research and wide-ranging practical applications. In this work, we present the first family of codes obtained via deep learning, which significantly beats state-of-the-art codes designed over several decades of research. The communication channel under consideration is the Gaussian noise channel with feedback, whose study was initiated by Shannon; feedback is known theoretically to improve the reliability of communication, but no practical codes that do so have ever been successfully constructed. We break this logjam by integrating information-theoretic insights harmoniously with recurrent-neural-network-based encoders and decoders to create novel codes that outperform known codes by three orders of magnitude in reliability. We also demonstrate several desirable properties of the codes: (a) generalization to larger block lengths; (b) composability with known codes; (c) adaptation to practical constraints. This result also has broader ramifications for coding theory: even when the channel has a clear mathematical model, deep learning methodologies, when combined with channel-specific information-theoretic insights, can potentially beat state-of-the-art codes constructed over decades of mathematical research.

AB - The design of codes for communicating reliably over a statistically well-defined channel is an important endeavor involving deep mathematical research and wide-ranging practical applications. In this work, we present the first family of codes obtained via deep learning, which significantly beats state-of-the-art codes designed over several decades of research. The communication channel under consideration is the Gaussian noise channel with feedback, whose study was initiated by Shannon; feedback is known theoretically to improve the reliability of communication, but no practical codes that do so have ever been successfully constructed. We break this logjam by integrating information-theoretic insights harmoniously with recurrent-neural-network-based encoders and decoders to create novel codes that outperform known codes by three orders of magnitude in reliability. We also demonstrate several desirable properties of the codes: (a) generalization to larger block lengths; (b) composability with known codes; (c) adaptation to practical constraints. This result also has broader ramifications for coding theory: even when the channel has a clear mathematical model, deep learning methodologies, when combined with channel-specific information-theoretic insights, can potentially beat state-of-the-art codes constructed over decades of mathematical research.

UR - http://www.scopus.com/inward/record.url?scp=85064808608&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85064808608&partnerID=8YFLogxK

M3 - Conference article

AN - SCOPUS:85064808608

VL - 2018-December

SP - 9436

EP - 9446

JO - Advances in Neural Information Processing Systems

JF - Advances in Neural Information Processing Systems

SN - 1049-5258

ER -