Universal Source Coding of Deep Neural Networks

Sourya Basu, Lav R Varshney

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Deep neural networks have shown incredible performance for inference tasks in a variety of domains. Unfortunately, most current deep networks are enormous cloud-based structures that require significant storage space, which limits scaling of deep learning as a service (DLaaS). This paper is concerned with finding universal lossless compressed representations of deep feedforward networks with synaptic weights drawn from discrete sets. The basic insight that allows much less rate than naive approaches is the recognition that the bipartite graph layers of feedforward networks have a kind of permutation invariance to the labeling of nodes, in terms of inferential operation. We provide efficient algorithms to dissipate this irrelevant uncertainty and then use arithmetic coding to nearly achieve the entropy bound in a universal manner.
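
As a rough illustration of the permutation invariance the abstract refers to, the following NumPy sketch shows that relabeling the hidden nodes of a feedforward layer pair leaves the computed function unchanged. This is not code from the paper; the ReLU activation, the ternary weight alphabet, and the log2(n!) figure are illustrative assumptions.

import numpy as np
from math import factorial, log2

rng = np.random.default_rng(0)
n_in, n_hid, n_out = 4, 5, 3

# Weights drawn from a small discrete alphabet, as in the abstract's setting
# (the ternary alphabet here is an illustrative assumption).
levels = np.array([-1.0, 0.0, 1.0])
W1 = rng.choice(levels, size=(n_hid, n_in))   # input -> hidden
W2 = rng.choice(levels, size=(n_out, n_hid))  # hidden -> output

def forward(W1, W2, x):
    # One hidden ReLU layer; the activation choice does not affect the argument.
    return W2 @ np.maximum(W1 @ x, 0.0)

x = rng.standard_normal(n_in)
perm = rng.permutation(n_hid)

# Relabel the hidden nodes: permute rows of W1 and the matching columns of W2.
W1p, W2p = W1[perm, :], W2[:, perm]

print(np.allclose(forward(W1, W2, x), forward(W1p, W2p, x)))  # True: same function

# All n_hid! labelings are equivalent, so an encoder only needs the multiset of
# per-node (incoming, outgoing) weight patterns, not an ordered list of nodes,
# saving roughly log2(n_hid!) bits for this layer.
print(log2(factorial(n_hid)))  # ~6.9 bits for this toy hidden layer

At realistic layer widths this labeling redundancy scales like log2(n!) ≈ n log2 n bits per hidden layer, which is the kind of irrelevant uncertainty the abstract says the proposed algorithms dissipate before arithmetic coding.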

Original language: English (US)
Title of host publication: Proceedings - DCC 2017, 2017 Data Compression Conference
Editors: Ali Bilgin, Joan Serra-Sagrista, Michael W. Marcellin, James A. Storer
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 310-319
Number of pages: 10
ISBN (Electronic): 9781509067213
DOI: 10.1109/DCC.2017.60
State: Published - May 8, 2017
Externally published: Yes
Event: 2017 Data Compression Conference, DCC 2017 - Snowbird, United States
Duration: Apr 4, 2017 - Apr 7, 2017

Publication series

Name: Data Compression Conference Proceedings
Volume: Part F127767
ISSN (Print): 1068-0314

Other

Other: 2017 Data Compression Conference, DCC 2017
Country: United States
City: Snowbird
Period: 4/4/17 - 4/7/17

Fingerprint

Invariance
Labeling
Entropy
Deep learning
Uncertainty
Deep neural networks

ASJC Scopus subject areas

  • Computer Networks and Communications

Cite this

Basu, S., & Varshney, L. R. (2017). Universal Source Coding of Deep Neural Networks. In A. Bilgin, J. Serra-Sagrista, M. W. Marcellin, & J. A. Storer (Eds.), Proceedings - DCC 2017, 2017 Data Compression Conference (pp. 310-319). [7923644] (Data Compression Conference Proceedings; Vol. Part F127767). Institute of Electrical and Electronics Engineers Inc.. https://doi.org/10.1109/DCC.2017.60

@inproceedings{fbe9b5f62d1c4193ab6184b18603858f,
title = "Universal Source Coding of Deep Neural Networks",
abstract = "Deep neural networks have shown incredible performance for inference tasks in a variety of domains. Unfortunately, most current deep networks are enormous cloud-based structures that require significant storage space, which limits scaling of deep learning as a service (DLaaS). This paper is concerned with finding universal lossless compressed representations of deep feedforward networkswith synaptic weights drawn from discrete sets. The basic insight that allows much less rate than naive approaches is the recognition that the bipartite graph layers of feedforward networks have a kind of permutation invariance to the labeling of nodes, in terms of inferential operation. We provide efficient algorithms to dissipate this irrelevant uncertainty and then use arithmetic coding to nearly achieve the entropy bound in a universal manner.",
author = "Sourya Basu and Varshney, {Lav R}",
year = "2017",
month = "5",
day = "8",
doi = "10.1109/DCC.2017.60",
language = "English (US)",
series = "Data Compression Conference Proceedings",
publisher = "Institute of Electrical and Electronics Engineers Inc.",
pages = "310--319",
editor = "Ali Bilgin and Joan Serra-Sagrista and Marcellin, {Michael W.} and Storer, {James A.}",
booktitle = "Proceedings - DCC 2017, 2017 Data Compression Conference",
address = "United States",

}

TY - GEN

T1 - Universal Source Coding of Deep Neural Networks

AU - Basu, Sourya

AU - Varshney, Lav R

PY - 2017/5/8

Y1 - 2017/5/8

N2 - Deep neural networks have shown incredible performance for inference tasks in a variety of domains. Unfortunately, most current deep networks are enormous cloud-based structures that require significant storage space, which limits scaling of deep learning as a service (DLaaS). This paper is concerned with finding universal lossless compressed representations of deep feedforward networks with synaptic weights drawn from discrete sets. The basic insight that allows much less rate than naive approaches is the recognition that the bipartite graph layers of feedforward networks have a kind of permutation invariance to the labeling of nodes, in terms of inferential operation. We provide efficient algorithms to dissipate this irrelevant uncertainty and then use arithmetic coding to nearly achieve the entropy bound in a universal manner.

AB - Deep neural networks have shown incredible performance for inference tasks in a variety of domains. Unfortunately, most current deep networks are enormous cloud-based structures that require significant storage space, which limits scaling of deep learning as a service (DLaaS). This paper is concerned with finding universal lossless compressed representations of deep feedforward networks with synaptic weights drawn from discrete sets. The basic insight that allows much less rate than naive approaches is the recognition that the bipartite graph layers of feedforward networks have a kind of permutation invariance to the labeling of nodes, in terms of inferential operation. We provide efficient algorithms to dissipate this irrelevant uncertainty and then use arithmetic coding to nearly achieve the entropy bound in a universal manner.

UR - http://www.scopus.com/inward/record.url?scp=85020056547&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85020056547&partnerID=8YFLogxK

U2 - 10.1109/DCC.2017.60

DO - 10.1109/DCC.2017.60

M3 - Conference contribution

AN - SCOPUS:85020056547

T3 - Data Compression Conference Proceedings

SP - 310

EP - 319

BT - Proceedings - DCC 2017, 2017 Data Compression Conference

A2 - Bilgin, Ali

A2 - Serra-Sagrista, Joan

A2 - Marcellin, Michael W.

A2 - Storer, James A.

PB - Institute of Electrical and Electronics Engineers Inc.

ER -