A fully analog memristor-based neural network with online gradient training

Eyal Rosenthal, Sergey Greshnikov, Daniel Soudry, Shahar Kvatinsky

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

In recent years, Neural Networks (NNs) have become widely popular for the execution of different machine learning algorithms. Training an NN is computationally intensive since it requires numerous multiplications of matrices that represent synaptic weights. It is therefore appealing to build a hardware-based NN accelerator to gain parallelism and efficient computation. Recently, we have proposed a compact circuit for a non-volatile synaptic weight based on two CMOS transistors and a memristor. In this paper, we present a fully analog NN design based on our previously proposed synapse, with a full design of the different layers and their supporting CMOS circuits. We show that the presented NN significantly reduces area compared to a CMOS-based NN, while executing online gradient training with accuracy comparable to a software implementation and improved computational speed.
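The training scheme the abstract refers to, online gradient training, updates the synaptic weight matrices immediately after every sample rather than after a full batch. A minimal software sketch of that idea is shown below; the network sizes, names (W1, W2), and toy task are illustrative assumptions, not details from the paper or its analog circuit.

```python
import numpy as np

# Hypothetical sketch of online (per-sample) gradient training for a
# small two-layer network. W1 and W2 play the role of the synaptic
# weight matrices; all sizes and hyperparameters are illustrative.
rng = np.random.default_rng(0)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])           # toy XOR task

W1 = rng.normal(0, 1, (2, 4))
W2 = rng.normal(0, 1, (4, 1))
lr = 0.5

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def mse_loss():
    return float(np.mean((sigmoid(sigmoid(X @ W1) @ W2) - y) ** 2))

loss_before = mse_loss()
for epoch in range(2000):
    for i in range(len(X)):                      # one sample at a time: "online"
        x = X[i:i+1]
        h = sigmoid(x @ W1)                      # forward pass
        out = sigmoid(h @ W2)
        err = out - y[i:i+1]                     # backpropagated error
        d_out = err * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)
        W2 -= lr * h.T @ d_out                   # weights updated immediately
        W1 -= lr * x.T @ d_h                     # after each sample
loss_after = mse_loss()
```

The per-sample weight update is what makes the scheme attractive for the analog setting: each memristive weight can be adjusted in place as samples stream through, with no batched gradient accumulation required.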

Original language: English
Title of host publication: 2016 IEEE International Symposium on Circuits and Systems, ISCAS 2016
Pages: 1394-1397
Number of pages: 4
DOIs
State: Published - 29 Jul 2016
Event: 2016 IEEE International Symposium on Circuits and Systems, ISCAS 2016 - Montreal, Canada
Duration: 22 May 2016 – 25 May 2016

Conference

Conference: 2016 IEEE International Symposium on Circuits and Systems, ISCAS 2016
Country/Territory: Canada
City: Montreal
Period: 22/05/16 – 25/05/16

Keywords

  • CMOS
  • Multilayer Neural Networks
  • RRAM
  • backpropagation
  • machine learning
  • memristor
  • neuromorphic

All Science Journal Classification (ASJC) codes

  • Electrical and Electronic Engineering
