Published in

The 2006 IEEE International Joint Conference on Neural Network Proceedings

DOI: 10.1109/ijcnn.2006.1716823

Information Rate Maximization over a Resistive Grid

Proceedings article published in 2006 by H. Koeppl
This paper is available in a repository.

Preprint: archiving allowed
Postprint: archiving allowed
Published version: archiving forbidden
(Data provided by SHERPA/RoMEO)

Abstract

The work presents the first results of the authors' research on adaptive cellular neural networks (CNN) based on a global information-theoretic cost function. It considers the simplest case of optimizing a resistive grid such that the Shannon information rate across the input-output boundaries of the grid is maximized. Besides its importance in information theory, the information rate has proven to be a useful concept for principal as well as independent component analysis (PCA, ICA). In contrast to linear fully connected neural networks, resistive grids, owing to their local coupling, can resemble models of physical media and are amenable to VLSI implementation. Results for the spatially invariant as well as the spatially variant case are presented, and their relation to principal subspace analysis (PSA) is outlined. Simulation results confirm the validity of the proposed results.
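The PCA/PSA connection mentioned in the abstract can be illustrated with a minimal sketch, which is not the paper's resistive-grid formulation: for a linear Gaussian channel y = Wx + n with input covariance C and white noise of variance s2, the information rate is I(W) = ½ log det(I + W C Wᵀ / s2), and among row-orthonormal projections W this is maximized by the principal subspace of C. The variable names and the toy covariance below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def info_rate(W, C, s2):
    """Shannon information rate of the linear Gaussian channel y = W x + n,
    with input covariance C and i.i.d. noise variance s2:
    I(W) = 0.5 * log det(I + W C W^T / s2)."""
    m = W.shape[0]
    return 0.5 * np.log(np.linalg.det(np.eye(m) + W @ C @ W.T / s2))

# Toy 3x3 input covariance with eigenvalues 4, 1, 0.25 (assumed for illustration).
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
C = Q @ np.diag([4.0, 1.0, 0.25]) @ Q.T
s2 = 0.1

# Projection onto the top-2 principal subspace (rows = leading eigenvectors) ...
evals, evecs = np.linalg.eigh(C)
W_pca = evecs[:, ::-1][:, :2].T

# ... versus an arbitrary row-orthonormal 2x3 projection.
W_rand, _ = np.linalg.qr(rng.standard_normal((3, 2)))
W_rand = W_rand.T

# The principal-subspace projection attains at least as high a rate,
# which is the sense in which rate maximization performs PSA.
print(info_rate(W_pca, C, s2) >= info_rate(W_rand, C, s2))
```

For the principal-subspace projection the rate evaluates in closed form to ½ log((1 + 4/s2)(1 + 1/s2)), since W C Wᵀ is then diag(4, 1); any other orthonormal projection interlaces below these eigenvalues and thus achieves a lower rate.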