Analytical guarantees on numerical precision of deep neural networks

Charbel Sakr, Yongjune Kim, Naresh Shanbhag

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

The acclaimed successes of neural networks often overshadow their tremendous complexity. We focus on numerical precision, a key parameter defining the complexity of neural networks. First, we present theoretical bounds on the accuracy in the presence of limited precision. Interestingly, these bounds can be computed via the back-propagation algorithm. Hence, by combining our theoretical analysis and the back-propagation algorithm, we are able to readily determine the minimum precision needed to preserve accuracy without having to resort to time-consuming fixed-point simulations. We provide numerical evidence showing how our approach allows us to maintain high accuracy but with lower complexity than state-of-the-art binary networks.
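
The abstract's key idea, that precision requirements can be read off back-propagated derivatives rather than measured by fixed-point simulation, can be illustrated with a small sketch. The Python snippet below is not the paper's actual bound: it is a hypothetical first-order estimate for a toy linear classifier, where the gradient of each decision margin with respect to the weights is available in closed form (standing in for back-propagation), quantization noise is modeled as uniform with step 2^-B, and a Chebyshev-style tail bound estimates the probability that quantization flips a decision.

import numpy as np

# Illustrative sketch only: a gradient-based estimate of how weight
# quantization perturbs a classifier's decision margins. The setup
# (a single linear layer) and the tail bound are assumptions made for
# this example; the paper's analytical bounds differ in form.
rng = np.random.default_rng(0)
W = rng.normal(size=(10, 64))         # toy "network": 10 classes, 64 features
x = rng.normal(size=64)               # one input sample

z = W @ x                             # pre-activation scores
top = int(z.argmax())                 # full-precision decision
margins = np.delete(z[top] - z, top)  # decision margins vs. the other classes

# The gradient of (z_top - z_i) w.r.t. W is +x on row `top` and -x on
# row i, so its squared norm is 2 * ||x||^2 for every competing class
# (a closed-form stand-in for a back-propagation pass).
grad_sq = 2.0 * float(x @ x)

for frac_bits in range(2, 12):
    delta = 2.0 ** -frac_bits         # fixed-point quantization step
    # Uniform quantization noise has variance delta^2 / 12 per weight,
    # so each margin perturbation has variance ~ (delta^2 / 12) * grad_sq
    # under an independence assumption.
    sigma = np.sqrt(delta**2 / 12.0 * grad_sq)
    # Chebyshev tail bound plus a union bound over competing classes
    # (deliberately loose; for illustration only).
    p_flip = min(1.0, float(((sigma / margins) ** 2).sum()))
    print(f"{frac_bits:2d} fractional bits: "
          f"estimated mismatch probability <= {p_flip:.4f}")

Sweeping the fractional bit-width and picking the smallest value whose estimate falls below a tolerance mirrors the minimum-precision search described in the abstract, with no fixed-point simulation of the network itself.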

Original language: English (US)
Title of host publication: 34th International Conference on Machine Learning, ICML 2017
Publisher: International Machine Learning Society (IMLS)
Pages: 4603-4615
Number of pages: 13
ISBN (Electronic): 9781510855144
State: Published - 2017
Event: 34th International Conference on Machine Learning, ICML 2017 - Sydney, Australia
Duration: Aug 6, 2017 - Aug 11, 2017

Publication series

Name: 34th International Conference on Machine Learning, ICML 2017
Volume: 6

Other

Other: 34th International Conference on Machine Learning, ICML 2017
Country/Territory: Australia
City: Sydney
Period: 8/6/17 - 8/11/17

ASJC Scopus subject areas

  • Computational Theory and Mathematics
  • Human-Computer Interaction
  • Software
