Marron, James Stephen
Optimal rates of convergence to Bayes risk in nonparametric discrimination. (English) Zbl 0554.62053
Ann. Stat. 11, 1142-1155 (1983).

Consider the multiclassification (discrimination) problem with known prior probabilities and a multi-dimensional vector of observations. Assume the underlying densities corresponding to the various classes are unknown, but a training sample of size N is available from each class. Rates of convergence to Bayes risk are investigated under smoothness conditions on the underlying densities of the type often seen in nonparametric density estimation. Because these rates can be drastically affected by a small change in the prior probabilities, the error criterion used here is the Bayes risk averaged (uniformly) over all prior probabilities. It is then shown that a certain rate, \(N^{-r}\), is optimal in the sense that no rule can do better (uniformly over the class of smooth densities), and a rule is exhibited which attains that rate. The optimal value of r depends on the smoothness of the densities and the dimensionality of the observations in the same way as for nonparametric density estimation with integrated squared error loss.

Cited in 6 Documents

MSC:
62H30 Classification and discrimination; cluster analysis (statistical aspects)
62G99 Nonparametric inference
62G05 Nonparametric estimation
62G20 Asymptotic properties of nonparametric inference

Keywords: optimal rates; multiclassification; discrimination; known prior probabilities; training sample; rates of convergence to Bayes risk; smooth densities
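The review does not state the exponent r explicitly. As an illustrative sketch only, assuming the class densities have p bounded derivatives on \(\mathbb{R}^d\), the integrated-squared-error rate from nonparametric density estimation that the review alludes to takes the familiar form:

\[
% Illustrative, not taken from the paper: this is the classical optimal rate for
% estimating a p-times differentiable density in d dimensions under integrated
% squared error; the review says Marron's exponent behaves "in the same way".
N^{-r}, \qquad r = \frac{2p}{2p + d},
\]

so, under this reading, the averaged excess over Bayes risk decays faster for smoother densities (larger p) and more slowly in higher dimensions (larger d).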