Published in

IEEE Transactions on Neural Networks and Learning Systems, 26(2), pp. 388-393, 2015

DOI: 10.1109/TNNLS.2014.2311855

Delay-Based Reservoir Computing: Noise Effects in a Combined Analog and Digital Implementation

This paper is available in a repository.

Preprint: archiving allowed
Postprint: archiving allowed
Published version: archiving forbidden
Data provided by SHERPA/RoMEO

Abstract

Reservoir computing is a paradigm in machine learning whose processing capabilities rely on the dynamical behavior of recurrent neural networks. We present a mixed analog and digital implementation of this concept with a nonlinear analog electronic circuit as the main computational unit. In our approach, the reservoir network can be replaced by a single nonlinear element with delay via time-multiplexing. We analyze the influence of noise on the performance of the system for two benchmark tasks: 1) a classification problem and 2) a chaotic time-series prediction task. Special attention is given to the role of quantization noise, which is studied by varying the resolution of the conversion interface between the analog and digital worlds.

Funding

This work was supported in part by MINECO, Spain; the Comunitat Autònoma de les Illes Balears; FEDER; the European Commission under Projects TEC2012-38864 and TEC2012-36335; Grups Competitius; the EC FP7 Project PHOCUS under Grant 240763; the Interuniversity Attraction Pole Photonics@be, Belgian Science Policy Office; and the Flemish Research Foundation.

Peer reviewed.
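
Illustrative sketch

The abstract describes two ingredients: a single nonlinear node with delayed feedback that emulates a full reservoir through time-multiplexing, and an analog-to-digital conversion step whose finite resolution introduces quantization noise. The Python sketch below is a minimal, self-contained illustration of that scheme, not the authors' circuit: the tanh nonlinearity, the parameters N, eta and gamma, the random binary mask, and the noisy-sine prediction task are all illustrative assumptions; only the variable quantizer resolution mirrors the paper's study of the conversion interface.

# Minimal sketch of delay-based reservoir computing: one nonlinear node,
# time-multiplexed over N virtual nodes, with quantization at the
# analog/digital interface. Parameters and task are illustrative only.
import numpy as np

rng = np.random.default_rng(0)

N = 50                     # number of virtual nodes along the delay line (assumed)
eta, gamma = 0.5, 0.05     # feedback strength and input scaling (assumed)
mask = rng.choice([-1.0, 1.0], size=N)   # random binary input mask (assumed)

def quantize(x, bits):
    """Uniform quantizer on [-1, 1] emulating a finite-resolution converter."""
    levels = 2 ** bits
    x = np.clip(x, -1.0, 1.0)
    return np.round((x + 1.0) / 2.0 * (levels - 1)) / (levels - 1) * 2.0 - 1.0

def reservoir_states(u, bits=None):
    """Drive the single node and collect the N virtual-node responses per
    input sample; feedback comes from the same virtual node one delay
    period earlier (a common simplification of the delay dynamics)."""
    T = len(u)
    states = np.zeros((T, N))
    x_prev = np.zeros(N)            # node values from the previous delay period
    for t in range(T):
        for i in range(N):
            drive = eta * x_prev[i] + gamma * mask[i] * u[t]
            x = np.tanh(drive)      # assumed sigmoid-like nonlinearity
            if bits is not None:    # quantize at the analog->digital interface
                x = quantize(x, bits)
            states[t, i] = x
        x_prev = states[t]
    return states

# Toy task: one-step-ahead prediction of a noisy sine, a stand-in for the
# chaotic time-series benchmark of the paper.
T = 1000
u = np.sin(0.1 * np.arange(T + 1)) + 0.01 * rng.standard_normal(T + 1)
X = reservoir_states(u[:-1], bits=8)     # 8-bit conversion interface
y = u[1:]

# Linear readout trained by ridge regression.
lam = 1e-6
W = np.linalg.solve(X.T @ X + lam * np.eye(N), X.T @ y)
nmse = np.mean((X @ W - y) ** 2) / np.var(y)
print(f"NMSE at 8-bit resolution: {nmse:.4f}")

Sweeping the bits argument (e.g., from 1 to 12) and recording the normalized mean-square error gives a rough analog of the paper's experiment of varying the resolution of the conversion interface.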