zbMATH — the first resource for mathematics

An operator theoretic approach to nonparametric mixture models. (English) Zbl 1431.62274
This paper studies the identifiability of mixture models without parametric assumptions on the component distributions. Specifically, in the absence of any distributional information on the individual mixture components, it quantifies the number of observations per group required, as a function of the number of mixture components, for the mixture to be identifiable. The article further characterizes the identifiability of mixtures with linearly independent and with jointly irreducible components; these results are sharp in the sense that they cannot be improved. An application to multinomial mixture models concludes the work.
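As a small illustration of the role linear independence plays in such identifiability results (a hypothetical sketch, not taken from the paper): for a multinomial mixture whose component distributions are linearly independent, the joint distribution of two observations drawn from the same latent component has the matrix form P = A diag(w) Aᵀ, and its rank recovers the number of components.

```python
import numpy as np

# Columns of A are two linearly independent component pmfs over a
# three-letter alphabet; w holds the mixing weights. (Toy values chosen
# for illustration only.)
A = np.array([[0.6, 0.1],
              [0.3, 0.2],
              [0.1, 0.7]])
w = np.array([0.4, 0.6])

# Joint pmf of a pair of observations from the same mixture component:
# P[i, j] = sum_k w[k] * A[i, k] * A[j, k]
P = A @ np.diag(w) @ A.T

# Because the columns of A are linearly independent and w > 0,
# rank(P) equals the number of mixture components.
print(np.linalg.matrix_rank(P))
```

With two observations per group the rank reveals the number of components; recovering the components themselves generally requires more observations per group, which is the kind of trade-off the paper quantifies.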
62H30 Classification and discrimination; cluster analysis (statistical aspects)
62G05 Nonparametric estimation
Full Text: DOI Euclid arXiv
[1] Allman, E. S., Matias, C. and Rhodes, J. A. (2009). Identifiability of parameters in latent structure models with many observed variables. Ann. Statist.37 3099-3132. · Zbl 1191.62003
[2] Anandkumar, A., Ge, R., Hsu, D., Kakade, S. M. and Telgarsky, M. (2014). Tensor decompositions for learning latent variable models. J. Mach. Learn. Res.15 2773-2832. · Zbl 1319.62109
[3] Anderson, J., Belkin, M., Goyal, N., Rademacher, L. and Voss, J. (2014). The more, the merrier: The blessing of dimensionality for learning large Gaussian mixtures. In Proceedings of the 27th Conference on Learning Theory 1135-1164.
[4] Arora, S., Ge, R., Kannan, R. and Moitra, A. (2012). Computing a nonnegative matrix factorization—provably. In STOC’12—Proceedings of the 2012 ACM Symposium on Theory of Computing 145-161. ACM, New York. · Zbl 1286.15014
[5] Arora, S., Ge, R. and Moitra, A. (2012). Learning topic models—going beyond SVD. In 2012 IEEE 53rd Annual Symposium on Foundations of Computer Science—FOCS 2012 1-10. IEEE Computer Soc., Los Alamitos, CA.
[6] Balan, R., Casazza, P. and Edidin, D. (2006). On signal reconstruction without phase. Appl. Comput. Harmon. Anal.20 345-356. · Zbl 1090.94006
[7] Bhaskara, A., Charikar, M., Moitra, A. and Vijayaraghavan, A. (2014). Smoothed analysis of tensor decompositions. In Proceedings of the Forty-Sixth Annual ACM Symposium on Theory of Computing, STOC’14 594-603. ACM, New York. · Zbl 1315.68203
[8] Blanchard, G. and Scott, C. (2014). Decontamination of mutually contaminated models. In Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics 1-9.
[9] Bruni, C. and Koch, G. (1985). Identifiability of continuous mixtures of unknown Gaussian distributions. Ann. Probab.13 1341-1357. · Zbl 0588.60014
[10] Comon, P., Golub, G., Lim, L.-H. and Mourrain, B. (2008). Symmetric tensors and symmetric tensor rank. SIAM J. Matrix Anal. Appl.30 1254-1279. · Zbl 1181.15014
[11] Donoho, D. and Stodden, V. (2004). When does non-negative matrix factorization give a correct decomposition into parts? In Advances in Neural Information Processing Systems (S. Thrun, L. K. Saul and B. Schölkopf, eds.) 16 1141-1148. MIT Press, Cambridge, MA.
[12] Folland, G. B. (1999). Real Analysis: Modern Techniques and Their Applications, 2nd ed. Pure and Applied Mathematics (New York). Wiley, New York. · Zbl 0924.28001
[13] Golub, G. H. and Van Loan, C. F. (1996). Matrix Computations, 3rd ed. Johns Hopkins Studies in the Mathematical Sciences. Johns Hopkins Univ. Press, Baltimore, MD. · Zbl 0865.65009
[14] Kadison, R. V. and Ringrose, J. R. (1983). Fundamentals of the Theory of Operator Algebras. Vol. I: Elementary Theory. Pure and Applied Mathematics100. Academic Press, New York. · Zbl 0518.46046
[15] Kallenberg, O. (2002). Foundations of Modern Probability, 2nd ed. Probability and Its Applications (New York). Springer, New York. · Zbl 0996.60001
[16] Kruskal, J. B. (1977). Three-way arrays: Rank and uniqueness of trilinear decompositions, with application to arithmetic complexity and statistics. Linear Algebra Appl.18 95-138. · Zbl 0364.15021
[17] Micchelli, C. A., Xu, Y. and Zhang, H. (2006). Universal kernels. J. Mach. Learn. Res.7 2651-2667. · Zbl 1222.68266
[18] Pachter, L. and Speyer, D. (2004). Reconstructing trees from subtree weights. Appl. Math. Lett.17 615-621. · Zbl 1055.05033
[19] Paz, A. (1971). Introduction to Probabilistic Automata. Academic Press, New York. · Zbl 0234.94055
[20] Rabani, Y., Schulman, L. J. and Swamy, C. (2014). Learning mixtures of arbitrary distributions over large discrete domains. In ITCS’14—Proceedings of the 2014 Conference on Innovations in Theoretical Computer Science 207-223. ACM, New York. · Zbl 1366.68247
[21] Song, L., Anandkumar, A., Dai, B. and Xie, B. (2014). Nonparametric estimation of multi-view latent variable models. In Proceedings of the 31st International Conference on Machine Learning, ICML 2014 640-648.
[22] Teicher, H. (1963). Identifiability of finite mixtures. Ann. Math. Stat.34 1265-1269. · Zbl 0137.12704
[23] Vandermeulen, R. A. and Scott, C. D. (2019). Supplement to “An operator theoretic approach to nonparametric mixture models.” DOI:10.1214/18-AOS1762SUPP.
[24] Yakowitz, S. J. and Spragins, J. D. (1968). On the identifiability of finite mixtures. Ann. Math. Stat.39 209-214. · Zbl 0155.25703