Published in

2008 IEEE International Conference on Networking, Sensing and Control

DOI: 10.1109/icnsc.2008.4525331

Common Nature of Learning Exemplified by BP and Hopfield Neural Networks for Solving Online a System of Linear Equations

Proceedings article published in 2008 by Yunong Zhang, Zhan Li, Ke Chen, and Binghuang Cai
This paper is available in a repository.

Preprint: archiving allowed
Postprint: archiving allowed
Published version: archiving forbidden
Data provided by SHERPA/RoMEO

Abstract

Many computational problems encountered in scientific and engineering applications can ultimately be reduced to solving a system of linear equations online. Classic numerical methods for solving linear equations, such as Gaussian elimination and matrix factorization, typically require O(n³) operations. As important parallel-computational models, both BP (back-propagation) and Hopfield neural networks can be exploited to solve such linear equations. The BP neural network differs evidently from the Hopfield neural network in terms of network definition, architecture, and learning pattern. Nevertheless, during the online solution of linear equations, the two networks share a common nature of learning; that is, they are governed by the same mathematical iteration formula. In addition, computer-simulation results substantiate the theoretical analysis of both BP and Hopfield neural networks solving such a system of linear equations online.
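
To make the shared learning behavior concrete, below is a minimal Python sketch (not taken from the paper) of a gradient-descent-type update of the kind such networks are commonly shown to implement when solving A x = b online. The specific matrix A, vector b, learning rate eta, and stopping tolerance are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Minimal sketch (not the authors' code): a gradient-descent-type iteration
# for solving A x = b online, of the general form
#     x(k+1) = x(k) - eta * A^T (A x(k) - b),
# i.e., iterative minimization of ||A x - b||^2.
# The matrix A, vector b, learning rate eta, and tolerance below are
# illustrative assumptions.

def solve_linear_system_iteratively(A, b, eta=0.05, tol=1e-8, max_iter=100000):
    """Iteratively minimize ||A x - b||^2 by gradient descent."""
    x = np.zeros(A.shape[1])
    for _ in range(max_iter):
        residual = A @ x - b            # current error A x(k) - b
        x = x - eta * (A.T @ residual)  # shared iteration formula
        if np.linalg.norm(residual) < tol:
            break
    return x

if __name__ == "__main__":
    A = np.array([[4.0, 1.0], [1.0, 3.0]])   # example coefficient matrix
    b = np.array([1.0, 2.0])                 # example right-hand side
    x = solve_linear_system_iteratively(A, b)
    print("approximate solution:", x)
    print("reference solution:  ", np.linalg.solve(A, b))
```

Each iteration involves only matrix-vector products, which is what makes updates of this form attractive for parallel, neural-network-style implementation compared with O(n³) direct methods.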