CMOS-compatible electrochemical synaptic transistor arrays for deep learning accelerators

Jinsong Cui, Fufei An, Jiangchao Qian, Yuxuan Wu, Luke L. Sloan, Saran Pidaparthy, Jian Min Zuo, Qing Cao

Research output: Contribution to journal › Article › peer-review

Abstract

In-memory computing architectures based on memristive crossbar arrays could offer higher computing efficiency than traditional hardware in deep learning applications. However, the core memory devices must be capable of performing high-speed and symmetric analogue programming with small variability. They should also be compatible with silicon technology and scalable to nanometre-sized footprints. Here we report an electrochemical synaptic transistor that operates by shuffling protons between a hydrogenated tungsten oxide channel and gate through a zirconium dioxide protonic electrolyte. These devices offer multistate and symmetric programming of channel conductance via gate-voltage pulse control and small cycle-to-cycle variation. They can be programmed at frequencies approaching the megahertz range and exhibit endurances of over 100 million read–write cycles. They are also compatible with complementary metal–oxide–semiconductor technology and can be scaled to lateral dimensions of 150 × 150 nm². Through monolithic integration with silicon transistors, we show that pseudo-crossbar arrays can be created for area- and energy-efficient deep learning accelerator applications.
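The in-memory computing principle behind crossbar arrays, and the symmetric pulse-programming the abstract describes, can be sketched numerically. This is a minimal illustration of the general technique, not the authors' implementation: the device names, step sizes, and conductance ranges below are illustrative assumptions. A crossbar stores each weight as a device conductance, so a matrix-vector multiply happens in one analogue step via Ohm's and Kirchhoff's laws, and each gate-voltage pulse nudges a conductance up or down by a near-constant increment.

```python
import numpy as np

rng = np.random.default_rng(0)

# Weights stored as cross-point conductances G (arbitrary units).
# Applying row voltages V yields column currents I = G^T V in a single
# analogue step: Ohm's law per device, Kirchhoff's law per column.
G = rng.uniform(0.0, 1.0, size=(4, 3))   # 4 rows x 3 columns of devices
V = rng.uniform(-0.5, 0.5, size=4)       # input voltages on the rows
I = G.T @ V                              # column currents = the MVM result

def program(g, n_pulses, step=0.01, g_min=0.0, g_max=1.0):
    """Idealised symmetric analogue programming (illustrative values):
    each gate-voltage pulse shifts the channel conductance by a fixed
    step, potentiating (n_pulses > 0) or depressing (n_pulses < 0),
    clipped to the device's conductance window."""
    return float(np.clip(g + n_pulses * step, g_min, g_max))

g = 0.5
g_up = program(g, +10)        # 10 potentiating pulses
g_back = program(g_up, -10)   # 10 depressing pulses restore the state
```

With perfectly symmetric steps, potentiation followed by equal depression returns the device to its starting conductance; asymmetry or cycle-to-cycle variation in real devices breaks this, which is why the abstract emphasises symmetric programming with small variability.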

Original language: English (US)
Pages (from-to): 292-300
Number of pages: 9
Journal: Nature Electronics
Volume: 6
Issue number: 4
DOIs
State: Published - Apr 2023

ASJC Scopus subject areas

  • Electronic, Optical and Magnetic Materials
  • Instrumentation
  • Electrical and Electronic Engineering

