Quantifying Generalization in Linearly Weighted Neural Networks (Short title: Quantifying Generalization)

Journal article by Martin Anthony, Sean B. Holden


Abstract
The Vapnik-Chervonenkis dimension has proven to be of great use in the theoretical study of generalization in artificial neural networks. The 'probably approximately correct' learning framework is described and the importance of the VC dimension is illustrated. We then investigate the VC dimension of certain types of linearly weighted neural networks. First, we obtain bounds on the VC dimensions of radial basis function networks with basis functions of several types. Secondly, we calculate the VC dimension of polynomial discriminant functions defined over both real and binary-valued inputs.
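
As an illustrative aside (not from the paper itself): a "linearly weighted" network is one whose output is a fixed set of nonlinear basis functions combined by an adjustable linear output layer. The sketch below shows such a network with Gaussian radial basis functions, fitted here by ordinary least squares purely for illustration; the helper names (rbf_features, fit_output_weights, predict), the choice of centres, and the width parameter are all assumptions of this example, not the authors' construction.

```python
import numpy as np

# Minimal sketch of a linearly weighted network: fixed Gaussian radial
# basis functions combined by trainable linear output weights w.
# (Illustrative only; the paper analyses the VC dimension of this
# architecture, it does not prescribe this training procedure.)

def rbf_features(X, centres, width=1.0):
    """Map inputs X (m, n) to Gaussian RBF activations (m, k)."""
    # Squared Euclidean distance between every input and every centre.
    d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * width ** 2))

def fit_output_weights(X, y, centres, width=1.0):
    """Least-squares fit of the linear output layer; y in {-1, +1}."""
    Phi = rbf_features(X, centres, width)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return w

def predict(X, centres, w, width=1.0):
    """Classify by the sign of the linear combination of basis functions."""
    return np.sign(rbf_features(X, centres, width) @ w)

# Example usage on a nonlinearly separable target.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.sign(X[:, 0] * X[:, 1])
centres = X[rng.choice(200, 20, replace=False)]
w = fit_output_weights(X, y, centres)
accuracy = (predict(X, centres, w) == y).mean()
```

The point of the sketch is structural: once the k basis functions are fixed, each hypothesis is a linear threshold function of the feature vector, so the network's capacity is governed by k rather than by the input dimension, which is what makes VC-dimension bounds for this class tractable.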