Vapnik, V. Three fundamental concepts of the capacity of learning machines. (English) Zbl 0799.60032
Physica A 200, No. 1-4, 538-544 (1993).

Summary: We present three fundamental concepts of the capacity of learning machines. These concepts describe necessary and sufficient conditions for the consistency of learning processes, as well as bounds on their rate of convergence, in both the distribution-dependent and the distribution-independent case.

Cited in 2 Documents

MSC:
60F99 Limit theorems in probability theory

Keywords: capacity of learning machines; rate of convergence

References:
[1] Vapnik, V. N.; Chervonenkis, A. J., Pattern Recognit. Image Anal., 283-305 (1991)
[2] Vapnik, V. N.; Chervonenkis, A. J., Theor. Prob. Appl., 26, 265-280 (1981)
[3] Vapnik, V. N.; Chervonenkis, A. J., Theor. Prob. Appl., 16, 532-553 (1971)
[4] Vapnik, V. N., Estimation of Dependences Based on Empirical Data (1982), Springer: Springer New York · Zbl 0499.62005
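The paper itself contains no code. As an illustrative aside only: the capacity notions the summary refers to rest on the idea of a hypothesis class shattering a point set (realizing all 2^n labelings on n points). The sketch below, with a toy threshold class and helper names of my own choosing (not from the paper), checks shattering empirically:

```python
def labelings(points, hypotheses):
    """Distinct labelings the hypothesis class induces on the given points."""
    return {tuple(h(x) for x in points) for h in hypotheses}

def shattered(points, hypotheses):
    """True if the class realizes all 2^n possible labelings on the points."""
    return len(labelings(points, hypotheses)) == 2 ** len(points)

# Toy class: threshold classifiers h_t(x) = 1 if x >= t, for a few thresholds.
# (The `t=t` default binds each threshold at definition time.)
thresholds = [0.5, 1.5, 2.5, 3.5]
H = [lambda x, t=t: int(x >= t) for t in thresholds]

print(shattered([1.0], H))        # a single point can be shattered -> True
print(shattered([1.0, 2.0], H))   # no threshold labels (1, 0) -> False
```

One point is shattered but two are not (a monotone threshold can never label a smaller point 1 and a larger point 0), so this class has VC dimension 1; the capacity measures in the paper bound learning behavior in terms of such quantities.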