TY - JOUR
T1 - DeepPolar
T2 - 41st International Conference on Machine Learning, ICML 2024
AU - Hebbar, S. Ashwin
AU - Ankireddy, Sravan Kumar
AU - Kim, Hyeji
AU - Oh, Sewoong
AU - Viswanath, Pramod
N1 - This research is supported in part by NSF CCF 2312753, ONR N00014-21-1-2388, NSF CNS 2112471, ARO W911NF2310062, ONR N00014-21-1-2379, NSF CNS-2008824, a gift from Intel, and Samsung Research America through 6G@UT center within the Wireless Networking and Communications Group (WNCG) at the University of Texas at Austin. We thank P. Trifonov for his insightful comments.
PY - 2024
Y1 - 2024
N2 - Progress in designing channel codes has been driven by human ingenuity and, fittingly, has been sporadic. Polar codes, developed on the foundation of Arikan's polarization kernel, represent the latest breakthrough in coding theory and have emerged as the state-of-the-art error-correction code for short-to-medium block length regimes. In an effort to automate the invention of good channel codes, especially in this regime, we explore a novel, non-linear generalization of Polar codes, which we call DeepPolar codes. DeepPolar codes extend the conventional Polar coding framework by utilizing a larger kernel size and parameterizing these kernels and matched decoders through neural networks. Our results demonstrate that these data-driven codes effectively leverage the benefits of a larger kernel size, resulting in enhanced reliability when compared to both existing neural codes and conventional Polar codes. Source code is available at this link.
UR - http://www.scopus.com/inward/record.url?scp=85203796053&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85203796053&partnerID=8YFLogxK
M3 - Conference article
AN - SCOPUS:85203796053
SN - 2640-3498
VL - 235
SP - 18133
EP - 18154
JO - Proceedings of Machine Learning Research
JF - Proceedings of Machine Learning Research
Y2 - 21 July 2024 through 27 July 2024
ER -