Training of quantized deep neural networks using a magnetic tunnel junction-based synapse

Tzofnat Greenberg-Toledo, Ben Perach, Itay Hubara, Daniel Soudry, Shahar Kvatinsky

Research output: Contribution to journal › Article › peer-review

Abstract

Quantized neural networks (QNNs) are being actively researched as a solution to the computational complexity and memory intensity of deep neural networks. This has sparked efforts to develop algorithms that support both inference and training with quantized weight and activation values without sacrificing accuracy. A recent example is the GXNOR framework for stochastic training of ternary and binary neural networks (TNNs and BNNs, respectively). In this paper, we show how magnetic tunnel junction (MTJ) devices can be used to support QNN training. We introduce a novel hardware synapse circuit that uses the stochastic switching behaviour of the MTJ to support the quantized update. The proposed circuit enables processing-near-memory (PNM) of QNN training, thereby reducing data movement. We simulated MTJ-based stochastic training of a TNN over the MNIST, SVHN, and CIFAR10 datasets and achieved an accuracy of (Equation presented), respectively (less than (Equation presented) degradation compared to the GXNOR algorithm). We evaluated the synapse array performance potential and showed that the proposed synapse circuit can train TNNs in situ, with (Equation presented) for feedforward and (Equation presented) for weight update.
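To make the idea of a stochastic quantized update concrete, the sketch below shows one way a GXNOR-style ternary weight update could be implemented in software. This is only an illustrative assumption for exposition: the function name, the probabilistic-rounding scheme, and all parameters are hypothetical and do not reproduce the paper's exact algorithm or the MTJ circuit behaviour.

```python
import numpy as np

def stochastic_ternary_update(weights, grad, lr=0.01, rng=None):
    """Illustrative sketch: update ternary weights in {-1, 0, +1} via a
    stochastic discrete transition.

    The continuous step -lr * grad is split into an integer part and a
    fractional part; the fractional part is applied with matching
    probability, mimicking the kind of stochastic switching a device such
    as an MTJ could provide in hardware (an assumption, not the paper's
    circuit model).
    """
    if rng is None:
        rng = np.random.default_rng()
    step = -lr * grad                      # desired continuous change
    sign = np.sign(step)
    mag = np.abs(step)
    int_part = np.floor(mag)
    frac_part = mag - int_part
    # Apply the fractional step stochastically: P(extra transition) = frac_part
    extra = (rng.random(weights.shape) < frac_part).astype(weights.dtype)
    delta = sign * (int_part + extra)
    # Project back onto the ternary state space {-1, 0, +1}
    return np.clip(weights + delta, -1, 1)

# Toy usage: a few ternary weights and an arbitrary gradient
w = np.array([-1.0, 0.0, 1.0, 0.0])
g = np.array([0.4, -0.7, 0.2, 0.05])
w_new = stochastic_ternary_update(w, g, lr=1.0)
```

The key design point illustrated here is that the expected value of the stochastic discrete update equals the continuous gradient step, so quantization noise averages out over training; in the paper this probabilistic behaviour is provided by the MTJ device rather than a software random number generator.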

Original language: English
Article number: 114003
Journal: Semiconductor Science and Technology
Volume: 36
Issue number: 11
DOIs
State: Published - Nov 2021

All Science Journal Classification (ASJC) codes

  • Electronic, Optical and Magnetic Materials
  • Condensed Matter Physics
  • Electrical and Electronic Engineering
  • Materials Chemistry
