Published in

Wiley Open Access, Advanced Intelligent Systems, 1(6), 2023

DOI: 10.1002/aisy.202300399

Binary‐Stochasticity‐Enabled Highly Efficient Neuromorphic Deep Learning Achieves Better‐than‐Software Accuracy

This paper is made freely available by the publisher.

Preprint: archiving allowed
Postprint: archiving allowed
Published version: archiving allowed
Data provided by SHERPA/RoMEO

Abstract

In this work, the requirement for high-precision (HP) signals is lifted, and the circuits for implementing deep learning algorithms in memristor-based hardware are simplified. HP signals are required by the backpropagation learning algorithm because its gradient-descent learning rule relies on the chain product of partial derivatives. However, implementing such an HP algorithm in noisy, analog memristor-based hardware systems is both challenging and biologically implausible. Herein, it is demonstrated that HP signal handling is not necessary and that more efficient deep learning can be achieved with a binary stochastic learning algorithm. The proposed algorithm modifies the elementary neural network operations, improving energy efficiency by two orders of magnitude compared with traditional memristor-based hardware and by three orders of magnitude compared with complementary metal–oxide–semiconductor-based hardware. It also provides better accuracy in pattern recognition tasks than the HP learning algorithm benchmarks.
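
To give a rough feel for the idea described in the abstract, the following is a minimal NumPy sketch, not the authors' exact algorithm: forward activations are Bernoulli-sampled to {0, 1} and backpropagated errors are reduced to their sign, so only binary signals drive the weight updates. The layer sizes, learning rate, and toy parity task are assumptions invented purely for demonstration and do not reproduce the paper's network or benchmarks.

```python
# Illustrative sketch (not the paper's exact method): a tiny two-layer network
# trained with binary stochastic forward signals and sign-binarized errors.
# All hyperparameters and the toy dataset below are arbitrary assumptions.
import numpy as np

rng = np.random.default_rng(0)

def binarize_stochastic(x):
    """Sample a {0, 1} signal with probability given by a sigmoid of the input."""
    prob = 1.0 / (1.0 + np.exp(-x))
    return (rng.random(x.shape) < prob).astype(np.float64)

def sign_binarize(e):
    """Reduce a backpropagated error to its sign (+1 / -1, zeros stay zero)."""
    return np.sign(e)

# Toy problem: 8-bit parity targets, purely for illustration.
X = rng.integers(0, 2, size=(256, 8)).astype(np.float64)
y = (X.sum(axis=1) % 2).reshape(-1, 1)

W1 = rng.normal(0, 0.5, size=(8, 16))
W2 = rng.normal(0, 0.5, size=(16, 1))
lr = 0.05

for epoch in range(20):
    # Forward pass: hidden layer emits binary stochastic activations.
    a1 = binarize_stochastic(X @ W1)
    out = 1.0 / (1.0 + np.exp(-(a1 @ W2)))   # output kept analog for the loss

    # Backward pass: errors are binarized before being propagated.
    e2 = sign_binarize(y - out)               # +/-1 error at the output
    e1 = sign_binarize((e2 @ W2.T) * a1)      # gated, binarized hidden error

    # Outer-product updates: every operand is binary, so each weight change
    # reduces to a fixed-size increment or decrement.
    W2 += lr * a1.T @ e2
    W1 += lr * X.T @ e1

    acc = ((out > 0.5) == (y > 0.5)).mean()

print(f"final training accuracy on the toy task: {acc:.2f}")
```

Because every operand in the update rule is binary, each synaptic change collapses to a fixed-magnitude potentiation or depression pulse, which is the kind of simplification of elementary neural network operations that the abstract credits for the reported efficiency gains on memristor-based hardware.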