Universal Source Coding of Deep Neural Networks

Sourya Basu, Lav R. Varshney

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Deep neural networks have shown incredible performance for inference tasks in a variety of domains. Unfortunately, most current deep networks are enormous cloud-based structures that require significant storage space, which limits scaling of deep learning as a service (DLaaS). This paper is concerned with finding universal lossless compressed representations of deep feedforward networks with synaptic weights drawn from discrete sets. The basic insight that allows a much lower rate than naive approaches is the recognition that the bipartite graph layers of feedforward networks have a kind of permutation invariance to the labeling of nodes, in terms of inferential operation. We provide efficient algorithms to dissipate this irrelevant uncertainty and then use arithmetic coding to nearly achieve the entropy bound in a universal manner.
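
To make the abstract's key idea concrete, here is a minimal Python sketch, not the algorithm from the paper: relabeling the hidden units of a feedforward layer leaves the network's input-output behavior unchanged, so fixing any canonical order removes up to log2(n!) bits of labeling uncertainty per hidden layer before entropy coding. The network shape, the ternary weight alphabet, and the lexicographic sorting key are all illustrative assumptions, and the empirical-entropy computation stands in for an actual arithmetic coder.

```python
import math

import numpy as np


def canonicalize(w_in: np.ndarray, w_out: np.ndarray):
    """Put the hidden units of a one-hidden-layer network in a canonical order.

    Relabeling hidden unit i as unit j permutes row i of w_in and column i
    of w_out together, leaving the network's input-output map unchanged.
    Sorting hidden units by a fixed rule (here: lexicographic order of their
    incoming weight rows) picks one representative per equivalence class,
    dissipating the labeling uncertainty before entropy coding.
    """
    order = np.lexsort(w_in.T[::-1])  # sort rows of w_in, leftmost column primary
    return w_in[order], w_out[:, order]


def entropy_bits_per_weight(weights: np.ndarray) -> float:
    """Empirical entropy (bits/weight) of a discrete weight sequence;
    an arithmetic coder with an i.i.d. model approaches this rate."""
    _, counts = np.unique(weights, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n_in, n_hidden, n_out = 8, 16, 4
    levels = np.array([-1.0, 0.0, 1.0])  # synaptic weights from a discrete set
    w1 = rng.choice(levels, size=(n_hidden, n_in))   # input -> hidden
    w2 = rng.choice(levels, size=(n_out, n_hidden))  # hidden -> output

    c1, c2 = canonicalize(w1, w2)
    n_weights = w1.size + w2.size
    rate = entropy_bits_per_weight(np.concatenate([c1.ravel(), c2.ravel()]))
    saved = math.log2(math.factorial(n_hidden))  # labeling bits removed

    print(f"entropy-coded size: about {n_weights * rate:.1f} bits")
    print(f"permutation invariance removes up to {saved:.1f} bits "
          f"of labeling uncertainty for this hidden layer")
```

Any fixed total order works as the canonical form: the decoder reconstructs a functionally identical network without side information, which is where the (up to) log2(n!) bits per hidden layer are saved.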

Original language: English (US)
Title of host publication: Proceedings - DCC 2017, 2017 Data Compression Conference
Editors: Ali Bilgin, Joan Serra-Sagrista, Michael W. Marcellin, James A. Storer
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 310-319
Number of pages: 10
ISBN (Electronic): 9781509067213
DOIs
State: Published - May 8, 2017
Externally published: Yes
Event: 2017 Data Compression Conference, DCC 2017 - Snowbird, United States
Duration: Apr 4, 2017 - Apr 7, 2017

Publication series

Name: Data Compression Conference Proceedings
Volume: Part F127767
ISSN (Print): 1068-0314

Other

Other: 2017 Data Compression Conference, DCC 2017
Country/Territory: United States
City: Snowbird
Period: 4/4/17 - 4/7/17

ASJC Scopus subject areas

  • Computer Networks and Communications
