Published in

Society of Photo-Optical Instrumentation Engineers, Optical Engineering, 37(4), p. 1305, 1998

DOI: 10.1117/1.601963

Discrete All-Positive Multilayer Perceptrons for Optical Implementation

Journal article published in 1998 by P. D. Moerland, Indu Saxena, and Emile Fiesler
This paper is available in a repository.

Preprint: archiving forbidden
Postprint: archiving allowed
Published version: archiving allowed
Data provided by SHERPA/RoMEO

Abstract

All-optical multilayer perceptrons differ in various ways from the ideal neural network model. Examples are the use of non-ideal activation functions that are truncated, asymmetric, and have a non-standard gain; the restriction of the network parameters to non-negative values; and the limited accuracy of the weights. In this paper, a backpropagation-based learning rule is presented that compensates for these non-idealities and enables the implementation of all-optical multilayer perceptrons in which learning occurs under the control of a computer. The good performance of this learning rule, even when using a small number of weight levels, is illustrated by a series of computer simulations incorporating the non-idealities.

1 Introduction

An important feature of multilayer perceptrons (MLPs) is their massive parallelism of weighted interconnections between layers of non-linear processing elements. Conventional digital computers, however, cannot take advantage of the parallel...
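As a rough illustration of two of the constraints described in the abstract (weights restricted to non-negative values and limited to a small number of discrete levels), the following Python sketch applies a plain backpropagation step to a toy single-layer network and then clips and quantizes the weights. It is a minimal sketch under assumed details, not the learning rule presented in the paper: the toy network, data, learning rate, number of weight levels, and the helper quantize_nonnegative are illustrative assumptions, and the standard sigmoid stands in for the truncated, asymmetric optical activation function.

    # Minimal sketch (not the paper's learning rule): gradient step followed by
    # clipping to non-negative values and quantization to a few discrete levels,
    # mimicking constraints an all-optical implementation would impose.
    import numpy as np

    def quantize_nonnegative(w, n_levels=8, w_max=1.0):
        """Clip weights to [0, w_max] and round them to n_levels discrete values."""
        w = np.clip(w, 0.0, w_max)
        step = w_max / (n_levels - 1)
        return np.round(w / step) * step

    rng = np.random.default_rng(0)
    x = rng.random((4, 3))            # toy inputs (assumed data)
    t = rng.random((4, 1))            # toy targets (assumed data)
    W = rng.random((3, 1))            # all-positive initial weights

    for _ in range(100):
        y = 1.0 / (1.0 + np.exp(-x @ W))          # sigmoid activation (stand-in)
        grad = x.T @ ((y - t) * y * (1.0 - y))    # backprop gradient for MSE loss
        W = quantize_nonnegative(W - 0.5 * grad)  # constrained, discretized update

In this sketch the constraints are enforced after every update, so the stored weights always correspond to values that could actually be realized with a small number of non-negative weight levels.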