Published in

Frontiers in Neuroscience (Frontiers Media), vol. 15, 2021

DOI: 10.3389/fnins.2021.612359

Bio-Inspired Architectures Substantially Reduce the Memory Requirements of Neural Network Models

Journal article published in 2021 by Thomas Dalgaty, John P. Miller, Elisa Vianello, and Jérôme Casas
This paper is made freely available by the publisher.

Preprint: archiving allowed
Postprint: archiving allowed
Published version: archiving allowed
Data provided by SHERPA/RoMEO

Abstract

We propose a neural network model of the jumping escape response observed in the cricket cercal sensory system. This sensory system processes the low-intensity air currents in the animal's immediate environment generated by predators, competitors, and mates. Our model is inspired by decades of physiological and anatomical studies. We compare its performance with that of models derived through a universal-approximation, or generic deep learning, approach, and demonstrate that, to achieve the same performance, those models require between one and two orders of magnitude more parameters. Furthermore, because the architecture of the bio-inspired model is defined by a set of logical relations between neurons, the model is interpretable. This work demonstrates the potential of incorporating bio-inspired architectural motifs, which have evolved in animal nervous systems, into memory-efficient neural network models.