In-Memory Computing Architectures for Sparse Distributed Memory

Mingu Kang, Naresh R. Shanbhag

Research output: Contribution to journal › Article › peer-review

Abstract

This paper presents an energy-efficient and high-throughput architecture for Sparse Distributed Memory (SDM), a computational model of the human brain [1]. The proposed SDM architecture is based on the recently proposed in-memory computing kernel for machine learning applications called Compute Memory (CM) [2], [3]. CM achieves energy and throughput efficiencies by deeply embedding computation into the memory array. SDM-specific techniques such as hierarchical binary decision (HBD) are employed to further reduce delay and energy. The CM-based SDM (CM-SDM) is a mixed-signal circuit, and hence circuit-aware behavioral, energy, and delay models in a 65 nm CMOS process are developed in order to predict system performance of SDM architectures in the auto- and hetero-associative modes. The delay and energy models indicate that CM-SDM, in general, can achieve up to 25× and 12× delay and energy reduction, respectively, over conventional SDM. When classifying 16 × 16 binary images with high noise levels (input bad pixel ratios: 15%-25%) into nine classes, all SDM architectures are able to generate output bad pixel ratios (Bo) ≤ 2%. The CM-SDM exhibits negligible loss in accuracy, i.e., its Bo degradation is within 0.4% as compared to that of the conventional SDM.
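For readers unfamiliar with the SDM model referenced in the abstract, the following is a minimal software sketch of Kanerva-style auto-associative storage and recall (random binary hard-location addresses, Hamming-distance activation, counter-based storage, majority-vote readout). It is an illustration of the algorithmic model only, not the paper's mixed-signal CM-SDM circuit; the location count, activation radius, and noise level are illustrative assumptions, with a 256-bit word chosen to match the 16 × 16 binary images mentioned above.

```python
# Minimal sketch of a Kanerva-style Sparse Distributed Memory (auto-associative mode).
# Parameters are hypothetical and chosen for illustration only.
import numpy as np

class SDM:
    def __init__(self, n_locations=1000, dim=256, radius=118, rng=None):
        self.rng = rng or np.random.default_rng(0)
        # Hard-location addresses: random binary vectors of length `dim`.
        self.addresses = self.rng.integers(0, 2, size=(n_locations, dim), dtype=np.int8)
        # Counter array accumulating bipolar (+1/-1) contributions of written patterns.
        self.counters = np.zeros((n_locations, dim), dtype=np.int32)
        self.radius = radius  # Hamming-distance activation threshold

    def _active(self, address):
        # Activate all hard locations within Hamming distance `radius` of the address.
        dist = np.count_nonzero(self.addresses != address, axis=1)
        return dist <= self.radius

    def write(self, address, data):
        # Add the bipolar form of `data` to the counters of every active location.
        self.counters[self._active(address)] += 2 * data.astype(np.int32) - 1

    def read(self, address):
        # Sum counters of active locations and threshold (binary majority decision).
        total = self.counters[self._active(address)].sum(axis=0)
        return (total > 0).astype(np.int8)

# Auto-associative recall: store a pattern at its own address, then query with a noisy copy.
rng = np.random.default_rng(1)
sdm = SDM(rng=rng)
pattern = rng.integers(0, 2, size=256, dtype=np.int8)
sdm.write(pattern, pattern)
noisy = pattern.copy()
flip = rng.choice(256, size=38, replace=False)   # roughly 15% input bad pixels
noisy[flip] ^= 1
recalled = sdm.read(noisy)
print("output bad pixel ratio:", np.mean(recalled != pattern))
```

In the hetero-associative mode described in the abstract, the data written at an address differs from the address itself (e.g., a class label pattern), so recall maps a noisy image to its class rather than cleaning the image.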

Original language: English (US)
Article number: 7489032
Pages (from-to): 855-863
Number of pages: 9
Journal: IEEE Transactions on Biomedical Circuits and Systems
Volume: 10
Issue number: 4
DOIs
State: Published - Aug 2016

Keywords

  • Associative memory
  • Compute Memory
  • brain-inspired computing
  • machine learning
  • pattern recognition
  • sparse distributed memory

ASJC Scopus subject areas

  • Biomedical Engineering
  • Electrical and Electronic Engineering
