Abstract

New device technologies such as spintronics, carbon nanotubes, and nanoscale CMOS incur random transient failures, with the failure probability governed by energy consumption through energy-failure functions. At the same time, deep neural networks are increasingly used for inference applications, and specialized hardware built on these nanotechnologies is being developed as the physical substrate. It is therefore important to understand the basic energy-reliability limits of such implementations. Using Pippenger's mutual information propagation technique, extended to directed acyclic graphs, together with optimization, we obtain a lower bound on the energy consumption of multilayer binary neural networks achieving a given reliability. We also obtain a simple energy allocation rule for neurons in the different layers of the network. The mathematical results further provide insight into the neuroenergetics of mammalian brain regions involved in sensory processing.
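
The abstract does not state the precise bound, but the flavor of the analysis can be illustrated with a toy model: suppose each layer behaves as a binary symmetric channel whose crossover probability follows a hypothetical exponential energy-failure function eps(e) = exp(-c*e), and that layers are cascaded serially, so mutual information through the cascade depends on how a fixed energy budget is split across layers. The exponential form, the serial-cascade simplification, and the function names below are illustrative assumptions, not the paper's actual model or result.

    import numpy as np

    def binary_entropy(p):
        """Binary entropy H(p) in bits, with H(0) = H(1) = 0."""
        p = np.clip(p, 1e-12, 1.0 - 1e-12)
        return -p * np.log2(p) - (1.0 - p) * np.log2(1.0 - p)

    def failure_prob(energy, c=1.0):
        """Hypothetical energy-failure function: eps(e) = exp(-c * e)."""
        return np.exp(-c * np.asarray(energy, dtype=float))

    def cascade_crossover(eps_list):
        """Effective crossover probability of binary symmetric channels in series."""
        eps = 0.0
        for e in eps_list:
            eps = eps * (1.0 - e) + (1.0 - eps) * e
        return eps

    def end_to_end_info(energies, c=1.0):
        """Mutual information (bits) across the layer cascade for a uniform binary input."""
        return 1.0 - binary_entropy(cascade_crossover(failure_prob(energies, c)))

    # Same total energy budget, two different per-layer allocations.
    budget, layers = 12.0, 4
    uniform = np.full(layers, budget / layers)
    skewed = np.array([6.0, 3.0, 2.0, 1.0])
    print(f"uniform allocation: {end_to_end_info(uniform):.4f} bits")
    print(f"skewed allocation:  {end_to_end_info(skewed):.4f} bits")

Running the sketch shows how end-to-end mutual information changes as energy is redistributed across layers under a fixed budget, which is the kind of trade-off the paper's energy allocation rule addresses (here only in a simplified, assumed setting).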

Original language: English (US)
Title of host publication: 2017 51st Annual Conference on Information Sciences and Systems, CISS 2017
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781509047802
State: Published - May 10 2017
Event: 51st Annual Conference on Information Sciences and Systems, CISS 2017 - Baltimore, United States
Duration: Mar 22 2017 - Mar 24 2017

Publication series

Name: 2017 51st Annual Conference on Information Sciences and Systems, CISS 2017

Other

Other: 51st Annual Conference on Information Sciences and Systems, CISS 2017
Country/Territory: United States
City: Baltimore
Period: 3/22/17 - 3/24/17

Keywords

  • Energy-failure function
  • Neural network
  • Reliability

ASJC Scopus subject areas

  • Signal Processing
  • Information Systems and Management
  • Computer Networks and Communications
  • Information Systems
