zbMATH — the first resource for mathematics

Leave-one-out bounds for kernel methods. (English) Zbl 1085.68144
Summary: In this article, we study leave-one-out style cross-validation bounds for kernel methods. The essential element in our analysis is a bound on the parameter estimation stability for regularized kernel formulations. Using this result, we derive bounds on expected leave-one-out cross-validation errors, which lead to expected generalization bounds for various kernel algorithms. In addition, we obtain variance bounds for leave-one-out errors. We apply our analysis to some classification and regression problems and compare the resulting bounds with previous results.
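To make the central quantity concrete: for kernel ridge regression, the leave-one-out residuals studied by bounds of this kind admit a well-known closed form via the smoother ("hat") matrix, so no refitting is needed. The sketch below is not from the paper; it is a standard identity for regularized least squares, and all function names are ours.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of X and Z.
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def loo_errors_kernel_ridge(K, y, lam):
    """Closed-form leave-one-out residuals for kernel ridge regression.

    With H = K (K + lam*I)^{-1} the hat matrix, the leave-one-out
    residual at point i equals (y_i - yhat_i) / (1 - H_ii).
    """
    n = len(y)
    H = K @ np.linalg.solve(K + lam * np.eye(n), np.eye(n))
    yhat = H @ y
    return (y - yhat) / (1.0 - np.diag(H))

def loo_errors_brute_force(X, y, lam, gamma=1.0):
    # Explicit refit on each n-1 point subset, for verification only.
    n = len(y)
    errs = np.empty(n)
    for i in range(n):
        mask = np.arange(n) != i
        Ktr = rbf_kernel(X[mask], X[mask], gamma)
        alpha = np.linalg.solve(Ktr + lam * np.eye(n - 1), y[mask])
        k_i = rbf_kernel(X[i:i + 1], X[mask], gamma)
        errs[i] = y[i] - (k_i @ alpha)[0]
    return errs
```

The two computations agree to numerical precision; an expected generalization bound of the type summarized above controls the average of such residuals over draws of the training sample.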

MSC:
68T05 Learning and adaptive systems in artificial intelligence