Published in

Proceedings. 2005 IEEE International Joint Conference on Neural Networks, 2005.

DOI: 10.1109/ijcnn.2005.1556011

A research on combination methods for ensembles of multilayer feedforward

Proceedings article published in 2005 by J. Torres Sospedra, M. Fernandez Redondo, C. Hernandez Espinosa.
This paper is available in a repository.

Preprint: archiving allowed
Postprint: archiving allowed
Published version: archiving forbidden
Data provided by SHERPA/RoMEO

Abstract

As shown in the literature, training an ensemble of networks is an effective way to improve performance with respect to a single network. The two key factors in designing an ensemble are how to train the individual networks and how to combine their different outputs into a single output class. In this paper, we focus on the combination methods. We study the performance of fourteen different combination methods for ensembles of the types "simple ensemble" and "decorrelated". For the "simple ensemble" with a low number of networks, the Zimmermann method achieves the best performance. When the number of networks is in the range of 9 to 20, the weighted average is the best alternative. Finally, for the "decorrelated" ensemble, simple averaging performs best over a wide range of ensemble sizes.
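For illustration only, the sketch below shows how two of the generic combiners mentioned in the abstract, plain output averaging and a weighted average, could combine the class-probability outputs of an ensemble. This is a minimal sketch under our own assumptions; the accuracy-proportional weights shown here are hypothetical and not necessarily the exact weighted-average formulation studied in the paper.

```python
# Illustrative sketch (not the paper's exact formulation): combining the
# class-probability outputs of an ensemble of multilayer feedforward networks.
import numpy as np

def average_combiner(outputs):
    """Plain averaging: mean of the per-network class probabilities.

    outputs: array of shape (n_networks, n_samples, n_classes).
    Returns the predicted class index per sample.
    """
    mean_probs = outputs.mean(axis=0)
    return mean_probs.argmax(axis=1)

def weighted_average_combiner(outputs, weights):
    """Weighted averaging: each network's output is scaled by a weight
    (e.g. derived from its accuracy on a validation set) before averaging.

    weights: array of shape (n_networks,), assumed to sum to 1.
    """
    weighted = np.tensordot(weights, outputs, axes=1)  # -> (n_samples, n_classes)
    return weighted.argmax(axis=1)

# Hypothetical usage: 3 networks, 2 samples, 2 classes.
outputs = np.array([
    [[0.9, 0.1], [0.4, 0.6]],
    [[0.6, 0.4], [0.3, 0.7]],
    [[0.2, 0.8], [0.7, 0.3]],
])
weights = np.array([0.5, 0.3, 0.2])  # e.g. proportional to validation accuracy

print(average_combiner(outputs))                    # predicted classes per sample
print(weighted_average_combiner(outputs, weights))  # predicted classes per sample
```

In this toy setup both combiners predict the same classes; the weighted average only changes the outcome when the better-weighted networks disagree with the simple majority of the ensemble.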