
Optimal functional supervised classification with separation condition. (English) Zbl 1441.62930

The paper deals with binary supervised classification of trajectories of stochastic processes. Based on a finite training set of labelled trajectories, one has to determine from which of the two (unknown) processes a newly observed trajectory originates. The authors consider trajectories of solutions of stochastic differential equations \[ dX_t = \mu(t)\,dt + dW_t, \qquad t \in [0,1],\quad \mu \in \{f,g\}, \] where the drifts \(f\) and \(g\) belong to the Sobolev-Hilbert space \(H_s(0,1)\), \(s\geq 0\), and satisfy the separation condition \(\|f-g\| \geq \Delta >0\).
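The review does not spell out the benchmark rule, but for context (a standard Cameron-Martin/Girsanov computation, not taken from the paper): if the drifts \(f\) and \(g\) were known and both classes equally likely, the Bayes classifier would assign a trajectory \(X\) to class \(f\) exactly when the log-likelihood ratio is nonnegative, \[ \int_0^1 \bigl(f(t)-g(t)\bigr)\,dX_t \;-\; \frac12 \int_0^1 \bigl(f(t)^2-g(t)^2\bigr)\,dt \;\geq\; 0. \] In the setting of the paper, \(f\) and \(g\) are unknown and must be learned from the training trajectories, which is what makes the problem nontrivial.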
The goal is to construct a natural classifier that, asymptotically, no other classifier can significantly outperform.
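As a purely illustrative sketch (not the estimator studied in the paper), the following Python snippet simulates Euler-discretized trajectories for two hypothetical drifts and applies a plug-in analogue of the likelihood-ratio rule above, with the unknown drifts replaced by averages of training increments. The drifts, grid size and sample sizes are arbitrary choices made for the example.

```python
# Illustrative sketch (not the paper's estimator): a plug-in analogue of the
# likelihood-ratio rule for dX_t = mu(t) dt + dW_t, mu in {f, g}, on an Euler grid.
import numpy as np

rng = np.random.default_rng(0)
T = 200                                    # number of grid points on [0, 1]
dt = 1.0 / T                               # step size
t = np.linspace(0.0, 1.0, T, endpoint=False)

f = lambda s: np.sin(2 * np.pi * s)        # hypothetical drifts, for illustration only
g = lambda s: np.sin(2 * np.pi * s) + 0.8  # separated: ||f - g|| = 0.8 > 0

def increments(mu, n):
    """Simulate n trajectories and return their Euler increments dX (shape n x T)."""
    return mu(t) * dt + np.sqrt(dt) * rng.standard_normal((n, T))

# Training sample: labelled increments from both classes.
n_train = 50
dX_f, dX_g = increments(f, n_train), increments(g, n_train)

# Plug-in drift estimates: averaged increments divided by the step size.
f_hat = dX_f.mean(axis=0) / dt
g_hat = dX_g.mean(axis=0) / dt

def classify(dX):
    """Empirical statistic: int (f_hat - g_hat) dX - 0.5 * int (f_hat^2 - g_hat^2) dt."""
    stat = np.sum((f_hat - g_hat) * dX) - 0.5 * np.sum((f_hat**2 - g_hat**2) * dt)
    return "f" if stat >= 0 else "g"

# Evaluate on fresh trajectories from each class.
test_f = [classify(dx) for dx in increments(f, 1000)]
test_g = [classify(dx) for dx in increments(g, 1000)]
err = (test_f.count("g") + test_g.count("f")) / 2000
print(f"empirical misclassification rate: {err:.3f}")
```

With the separation \(\|f-g\|=0.8\) and 50 training curves per class used here, the plug-in rule performs close to the oracle rule with known drifts; shrinking \(\Delta\) or the training sample degrades it visibly, illustrating the role of the separation parameter \(\Delta\).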

MSC:

62R10 Functional data analysis
62H30 Classification and discrimination; cluster analysis (statistical aspects)
68T05 Learning and adaptive systems in artificial intelligence
60H10 Stochastic ordinary differential equations (aspects of stochastic analysis)
