VC-dimension on manifolds: a first approach. (English) Zbl 1138.68035

The Vapnik-Chervonenkis (VC) dimension of a set of indicator functions on a space is a measure of the ability of these functions to distinguish between sets of points in the space. This is significant in the study of machine learning because it quantifies the capacity of the indicator functions to classify objects by their attributes. Most previous work on VC-dimension is set in Euclidean spaces, but there are naturally arising cases where the spaces of interest are manifolds; for example, angular data naturally live on a circle. This paper begins a study of VC-dimension in the context of manifolds, where a major outstanding problem is the case of product manifolds. The paper considers two kinds of indicator functions on manifolds: indicators derived from Morse functions, and indicators derived from covering maps, in the case of surfaces whose covers are Euclidean spaces.
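To make the shattering notion behind VC-dimension concrete, here is a small brute-force sketch in Python (an illustration of the reviewer's circle example, not code from the paper under review). It checks whether a finite family of arc indicators on the circle realizes every possible labeling of a point set; the sampled grid of arcs stands in for the full, infinite class. Arcs can shatter three points on the circle, since every proper subset of three circularly ordered points is contiguous, but no four points, since an "opposite pair" of four points is not an arc.

```python
def arc_indicator(a, b):
    """Indicator of the half-open arc [a, b) on the circle,
    angles in degrees; wraps past 360 when a > b."""
    def h(x):
        x = x % 360
        if a <= b:
            return a <= x < b
        return x >= a or x < b
    return h

def shattered(points, hypotheses):
    """Brute-force shattering check: does some hypothesis realize
    every one of the 2^n possible +/- labelings of the points?"""
    realized = {tuple(h(x) for x in points) for h in hypotheses}
    return len(realized) == 2 ** len(points)

# A finite 5-degree grid of arcs stands in for the full class.
angles = range(0, 360, 5)
arcs = [arc_indicator(a, b) for a in angles for b in angles]
arcs.append(lambda x: False)  # the empty arc

print(shattered([0, 120, 240], arcs))      # three points: True
print(shattered([0, 90, 180, 270], arcs))  # four points: False
```

The second check fails because no arc can contain the antipodal pair {0, 180} while excluding both 90 and 270.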

MSC:

68Q32 Computational learning theory
58E05 Abstract critical point theory (Morse theory, Lyusternik-Shnirel’man theory, etc.) in infinite-dimensional spaces
68T05 Learning and adaptive systems in artificial intelligence