LEARN Codes: Inventing Low-Latency Codes via Recurrent Neural Networks

Yihan Jiang, Hyeji Kim, Himanshu Asnani, Sreeram Kannan, Sewoong Oh, Pramod Viswanath

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Designing channel codes under low-latency constraints is one of the most demanding requirements in 5G standards. However, sharp characterizations of the performance of traditional codes are available only in the large block length limit. Code designs guided by these asymptotic analyses require large block lengths, and hence long latency, to achieve the desired error rate. Furthermore, when codes designed for one channel (e.g., the Additive White Gaussian Noise (AWGN) channel) are used on another (e.g., a non-AWGN channel), heuristics are necessary to achieve any nontrivial performance, so the codes severely lack robustness and adaptivity. By jointly designing a recurrent neural network (RNN) based encoder and decoder, we obtain an end-to-end learned neural code that outperforms the canonical convolutional code in the block setting. Building on this experience of designing a novel neural block code, we propose a new class of codes under the low-latency constraint, Low-latency Efficient Adaptive Robust Neural (LEARN) codes, which outperform state-of-the-art low-latency codes and also exhibit robustness and adaptivity. LEARN codes show the potential of designing new, versatile, and universal codes for future communications with tools of modern deep learning coupled with communication-engineering insight.
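The sketch below is not the authors' LEARN architecture; it only illustrates the end-to-end training idea summarized in the abstract: an RNN (GRU) encoder maps information bits to power-normalized code symbols, an AWGN channel corrupts them, and an RNN decoder recovers the bits, with both networks trained jointly. All module names, layer sizes, the code rate, and the training hyperparameters are illustrative assumptions, written here in PyTorch.

# Minimal sketch of joint encoder/decoder training over an AWGN channel.
# Not the paper's LEARN architecture; all names and sizes are assumptions.
import torch
import torch.nn as nn

class RNNEncoder(nn.Module):
    def __init__(self, hidden=64, rate=2):
        super().__init__()
        self.rnn = nn.GRU(input_size=1, hidden_size=hidden, batch_first=True)
        self.out = nn.Linear(hidden, rate)   # `rate` coded symbols per bit (rate-1/2 here)

    def forward(self, bits):                 # bits: (batch, block_len, 1) in {0, 1}
        h, _ = self.rnn(2.0 * bits - 1.0)    # map bits to +/-1 before encoding
        x = self.out(h)
        # power-normalize the codeword so the channel SNR is well defined
        return (x - x.mean()) / (x.std() + 1e-8)

class RNNDecoder(nn.Module):
    def __init__(self, hidden=64, rate=2):
        super().__init__()
        self.rnn = nn.GRU(input_size=rate, hidden_size=hidden,
                          batch_first=True, bidirectional=True)
        self.out = nn.Linear(2 * hidden, 1)

    def forward(self, y):                    # y: noisy codeword (batch, block_len, rate)
        h, _ = self.rnn(y)
        return self.out(h)                   # one logit per information bit

def awgn(x, snr_db):
    # additive white Gaussian noise for a unit-power signal
    sigma = 10 ** (-snr_db / 20.0)
    return x + sigma * torch.randn_like(x)

enc, dec = RNNEncoder(), RNNDecoder()
opt = torch.optim.Adam(list(enc.parameters()) + list(dec.parameters()), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

for step in range(200):                      # toy training loop
    bits = torch.randint(0, 2, (128, 100, 1)).float()
    logits = dec(awgn(enc(bits), snr_db=0.0))
    loss = loss_fn(logits, bits)
    opt.zero_grad(); loss.backward(); opt.step()

Note that the full-block bidirectional decoder used above for simplicity does not satisfy a low-latency constraint; the paper's low-latency setting additionally limits how much of the received sequence the decoder can observe before emitting each bit, which this sketch does not model.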

Original language: English (US)
Title of host publication: 2019 IEEE International Conference on Communications, ICC 2019 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781538680889
DOI: 10.1109/ICC.2019.8761286
State: Published - May 2019
Event: 2019 IEEE International Conference on Communications, ICC 2019 - Shanghai, China
Duration: May 20, 2019 – May 24, 2019

Publication series

Name: IEEE International Conference on Communications
Volume: 2019-May
ISSN (Print): 1550-3607

Conference

Conference: 2019 IEEE International Conference on Communications, ICC 2019
Country: China
City: Shanghai
Period: 5/20/19 – 5/24/19

Fingerprint

  • Recurrent neural networks
  • Convolutional codes
  • Block codes
  • Communication
  • Deep learning

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Electrical and Electronic Engineering

Cite this

Jiang, Y., Kim, H., Asnani, H., Kannan, S., Oh, S., & Viswanath, P. (2019). LEARN Codes: Inventing Low-Latency Codes via Recurrent Neural Networks. In 2019 IEEE International Conference on Communications, ICC 2019 - Proceedings [8761286] (IEEE International Conference on Communications; Vol. 2019-May). Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/ICC.2019.8761286
