TY - GEN
T1 - A 42pJ/decision 3.12TOPS/W robust in-memory machine learning classifier with on-chip training
AU - Gonugondla, Sujan Kumar
AU - Kang, Mingu
AU - Shanbhag, Naresh
N1 - Publisher Copyright:
© 2018 IEEE.
PY - 2018/3/8
Y1 - 2018/3/8
N2 - Embedded sensory systems (Fig. 31.2.1) continuously acquire and process data for inference and decision-making purposes under stringent energy constraints. These always-ON systems need to track changing data statistics and environmental conditions, such as temperature, with minimal energy consumption. Digital inference architectures [1,2] are not well-suited for such energy-constrained sensory systems due to their high energy consumption, which is dominated (>75%) by the energy cost of memory read accesses and digital computations. In-memory architectures [3,4] significantly reduce the energy cost by embedding pitch-matched analog computations in the periphery of the SRAM bitcell array (BCA). However, their analog nature combined with stringent area constraints makes these architectures susceptible to process, voltage, and temperature (PVT) variations. Previously, off-chip training [4] has been shown to be effective in compensating for PVT variations of in-memory architectures. However, PVT variations are die-specific, and data statistics in always-ON sensory systems can change over time. Thus, on-chip training is critical to address both sources of variation and to enable the design of energy-efficient always-ON sensory systems based on in-memory architectures. The stochastic gradient descent (SGD) algorithm is widely used to train machine learning models such as support vector machines (SVMs), deep neural networks (DNNs), and others. This paper demonstrates the use of on-chip SGD-based training to compensate for PVT and data-statistics variations in the design of a robust in-memory SVM classifier.
UR - http://www.scopus.com/inward/record.url?scp=85046487520&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85046487520&partnerID=8YFLogxK
U2 - 10.1109/ISSCC.2018.8310398
DO - 10.1109/ISSCC.2018.8310398
M3 - Conference contribution
AN - SCOPUS:85046487520
T3 - Digest of Technical Papers - IEEE International Solid-State Circuits Conference
SP - 490
EP - 492
BT - 2018 IEEE International Solid-State Circuits Conference, ISSCC 2018
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 65th IEEE International Solid-State Circuits Conference, ISSCC 2018
Y2 - 11 February 2018 through 15 February 2018
ER -