Quantifying Generalization in Linearly Weighted Neural Networks (Short title: Quantifying Generalization)

Journal article published in 2000 by Martin Anthony, Sean B. Holden
This paper was not found in any repository; the policy of its publisher is unknown or unclear.

Full text: Unavailable

Preprint: policy unknown
Postprint: policy unknown
Published version: policy unknown

Abstract

The Vapnik-Chervonenkis dimension has proven to be of great use in the theoretical study of generalization in artificial neural networks. The 'probably approximately correct' learning framework is described and the importance of the VC dimension is illustrated. We then investigate the VC dimension of certain types of linearly weighted neural networks. First, we obtain bounds on the VC dimensions of radial basis function networks with basis functions of several types. Second, we calculate the VC dimension of polynomial discriminant functions defined over both real and binary-valued inputs.
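
Although the full text is unavailable here, the quantities the abstract names are standard. As a hedged illustration, not taken from the paper itself: a linearly weighted class (the sign of a linear combination of fixed basis functions) spanning a k-dimensional space of functions has VC dimension k by Dudley's theorem, so degree-d polynomial discriminants on R^n have VC dimension C(n+d, d) (the number of monomials of degree at most d), and on {0,1}^n the monomials collapse to multilinear terms, giving the sum of C(n, i) for i = 0, ..., d. The sketch below computes these counts and a standard PAC sufficient-sample-size bound; the constant in the bound is illustrative only.

```python
from math import comb, log, ceil

def vc_dim_poly_real(n: int, d: int) -> int:
    """VC dimension of degree-d polynomial discriminants on R^n:
    the number of monomials of degree at most d, C(n+d, d)."""
    return comb(n + d, d)

def vc_dim_poly_binary(n: int, d: int) -> int:
    """On binary inputs {0,1}^n, powers collapse (x_i^2 = x_i), so
    the count is the number of multilinear terms: sum_i C(n, i)."""
    return sum(comb(n, i) for i in range(d + 1))

def pac_sample_bound(vc: int, eps: float, delta: float, c: float = 8.0) -> int:
    """A standard PAC-style sufficient sample size,
    m = O((1/eps) * (vc * ln(1/eps) + ln(1/delta))).
    The constant c = 8 is illustrative, not from the paper."""
    return ceil((c / eps) * (vc * log(2.0 / eps) + log(2.0 / delta)))

# Example: degree-2 polynomial discriminants over 10 real inputs.
vc = vc_dim_poly_real(10, 2)                 # C(12, 2) = 66
print(vc, vc_dim_poly_binary(10, 2))         # 66 and 1 + 10 + 45 = 56
print(pac_sample_bound(vc, eps=0.1, delta=0.05))
```

Note the gap between the two counts: restricting the same degree-2 discriminants from real to binary inputs shrinks the VC dimension from 66 to 56, which is the kind of input-dependent calculation the abstract's second result concerns.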