
Published in

2008 Second International Symposium on Intelligent Information Technology Application

DOI: 10.1109/iita.2008.128


On the Variable Step-Size of Discrete-Time Zhang Neural Network and Newton Iteration for Constant Matrix Inversion

Journal article published in 2008 by Yunong Zhang, Binghuang Cai, Mingjiong Liang, Weimu Ma
This paper is available in a repository.


Preprint: archiving allowed
Postprint: archiving allowed
Published version: archiving forbidden
Data provided by SHERPA/RoMEO

Abstract

A special kind of recurrent neural network has recently been proposed by Zhang et al. for matrix inversion. For possible hardware and digital-circuit realization, the corresponding discrete-time model of the Zhang neural network (ZNN) is proposed for constant matrix inversion; it reduces exactly to Newton iteration when linear activation functions and a constant step size of 1 are used. In this paper, a variable step-size selection method is investigated for such a discrete-time ZNN model, in which different variable step-size rules are derived for different kinds of activation functions. For comparative purposes, a fixed step-size selection method is presented as well. Numerical examples demonstrate the efficacy of the discrete-time ZNN model, especially when the variable step-size method is used.
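
To illustrate the relationship described in the abstract, the Python sketch below iterates a discrete-time ZNN-style update for constant matrix inversion. The particular update form X_{k+1} = X_k - h * X_k * f(A X_k - I), the function name dtznn_inverse, and the scaled-transpose initial guess are illustrative assumptions, not the paper's exact formulation. With a linear (identity) activation f and step size h = 1, the update collapses to the classical Newton iteration X_{k+1} = 2 X_k - X_k A X_k, which is the reduction the abstract mentions.

import numpy as np

def dtznn_inverse(A, h=1.0, activation=None, num_iters=50):
    """Approximate inv(A) with a discrete-time ZNN-style iteration.

    Illustrative sketch only: the assumed update is
        X_{k+1} = X_k - h * X_k @ f(A @ X_k - I),
    where f is an element-wise activation function and h the step size.
    """
    n = A.shape[0]
    I = np.eye(n)
    if activation is None:
        activation = lambda E: E          # linear activation
    # A common, convergence-friendly initial guess for Newton-type iterations:
    # X_0 = A^T / (||A||_1 * ||A||_inf)
    X = A.T / (np.linalg.norm(A, 1) * np.linalg.norm(A, np.inf))
    for _ in range(num_iters):
        E = A @ X - I                     # residual error matrix
        X = X - h * X @ activation(E)     # ZNN-style update
    return X

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((4, 4)) + 4 * np.eye(4)   # well-conditioned test matrix
    # With linear activation and h = 1 this is exactly Newton iteration:
    # X_{k+1} = X_k - X_k (A X_k - I) = 2 X_k - X_k A X_k
    X = dtznn_inverse(A, h=1.0)
    print(np.max(np.abs(A @ X - np.eye(4))))          # residual should be near zero

Passing a different element-wise activation (for example, a power-sigmoid) or a step size other than 1 departs from plain Newton iteration; choosing that step size per iteration is the variable step-size question the paper studies.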