
Low rank multivariate regression. (English) Zbl 1274.62434

Summary: We consider in this paper the multivariate regression problem, when the target regression matrix \(A\) is close to a low rank matrix. Our primary interest is in the practical case where the variance of the noise is unknown. Our main contribution is to propose in this setting a criterion for selecting among a family of low rank estimators, and to prove a non-asymptotic oracle inequality for the resulting estimator. We also investigate the easier case where the variance of the noise is known, and show that the penalties appearing in our criteria are minimal (in some sense). These penalties involve the expected values of Ky-Fan norms of certain random matrices. These quantities can be evaluated easily in practice, and upper bounds can be derived from recent results in random matrix theory.
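As an illustration of the last point (not part of the original record): the Ky-Fan \(k\)-norm of a matrix is the sum of its \(k\) largest singular values, so the expected Ky-Fan norms entering such penalties can be approximated by plain Monte Carlo simulation. The sketch below, with hypothetical function names and a standard Gaussian noise matrix as a working assumption, shows one way to do this with NumPy.

```python
import numpy as np

def ky_fan_norm(M, k):
    # Ky-Fan k-norm: sum of the k largest singular values of M.
    s = np.linalg.svd(M, compute_uv=False)  # returned in decreasing order
    return s[:k].sum()

def expected_ky_fan(n, p, k, n_samples=1000, rng=None):
    # Monte Carlo estimate of E[||G||_(k)] for an n x p matrix G
    # with i.i.d. standard Gaussian entries (an illustrative choice;
    # the relevant noise distribution depends on the model at hand).
    rng = np.random.default_rng(rng)
    total = 0.0
    for _ in range(n_samples):
        G = rng.standard_normal((n, p))
        total += ky_fan_norm(G, k)
    return total / n_samples
```

Such a simulation converges quickly since the Ky-Fan norm of a Gaussian matrix concentrates around its mean; non-asymptotic upper bounds on the same quantity are available from random matrix theory.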

MSC:

62H99 Multivariate analysis
62H12 Estimation in multivariate analysis
60B20 Random matrices (probabilistic aspects)
62J05 Linear regression; mixed models

References:

[1] T.W. Anderson. Estimating linear restrictions on regression coefficients for multivariate normal distributions. Annals of Mathematical Statistics 22 (1951), 327-351. · Zbl 0043.13902 · doi:10.1214/aoms/1177729580
[2] C.W. Anderson, E.A. Stolz, and S. Shamsunder. Multivariate autoregressive models for classification of spontaneous electroencephalogram during mental tasks. IEEE Transactions on Biomedical Engineering, 45 no 3 (1998), 277-286.
[3] F. Bach. Consistency of trace norm minimization. Journal of Machine Learning Research, 9 (2008), 1019-1048. · Zbl 1225.68146
[4] Y. Baraud, C. Giraud and S. Huet. Estimator selection in the Gaussian setting. arXiv:1007.2096v2 · Zbl 1298.62113
[5] L. Birgé and P. Massart. Minimal penalties for Gaussian model selection. Probability Theory and Related Fields, 138 no 1-2 (2007), 33-73. · Zbl 1112.62082 · doi:10.1007/s00440-006-0011-8
[6] E.N. Brown, R.E. Kass, and P.P. Mitra. Multiple neural spike train data analysis: state-of-the-art and future challenges. Nature Neuroscience, 7 no 5 (2004), 456-461.
[7] F. Bunea, Y. She and M. Wegkamp. Adaptive rank penalized estimators in multivariate regression. arXiv:1004.2995v1 (2010).
[8] F. Bunea, Y. She and M. Wegkamp. Optimal selection of reduced rank estimators of high-dimensional matrices. To appear in the Annals of Statist. · Zbl 1216.62086 · doi:10.1214/11-AOS876
[9] K.R. Davidson and S.J. Szarek. Local operator theory, random matrices and Banach spaces. In Handbook of the Geometry of Banach Spaces, North-Holland Publishing Co., Amsterdam, 2001. · Zbl 1067.46008
[10] C. Giraud. Low rank multivariate regression. arXiv :1009.5165v1 (Sept., 2010) · Zbl 1274.62434 · doi:10.1214/11-EJS625
[11] C. Giraud. A pseudo RIP for multivariate regression. arXiv:1106.5599v1 (2011).
[12] L. Harrison, W.D. Penny, and K. Friston. Multivariate autoregressive modeling of fMRI time series. NeuroImage, 19 (2004), 1477-1491.
[13] R.A. Horn and C.R. Johnson. Topics in Matrix Analysis. Cambridge University Press, Cambridge, 1994. · Zbl 0801.15001
[14] A.J. Izenman. Reduced-rank regression for the multivariate linear model. Journal of Multivariate analysis 5 (1975), 248-262. · Zbl 0313.62042 · doi:10.1016/0047-259X(75)90042-1
[15] A.J. Izenman. Modern Multivariate Statistical Techniques: Regression, Classification, and Manifold Learning. Springer, New York, 2008. · Zbl 1155.62040 · doi:10.1007/978-0-387-78189-1
[16] V. Koltchinskii, A. Tsybakov and K. Lounici. Nuclear norm penalization and optimal rates for noisy low rank matrix completion. To appear in the Annals of Statist. · Zbl 1231.62097 · doi:10.1214/11-AOS894
[17] M. Ledoux. The concentration of measure phenomenon. Mathematical Surveys and Monographs, 89. American Mathematical Society, Providence, 2001. · Zbl 0995.60002
[18] Z. Lu, R. Monteiro and M. Yuan. Convex optimization methods for dimension reduction and coefficient estimation in multivariate linear regression. Mathematical Programming (to appear). · Zbl 1246.90120 · doi:10.1007/s10107-010-0350-1
[19] V. A. Marchenko, L.A. Pastur. Distribution of eigenvalues for some sets of random matrices. Mat. Sb. (N.S.), 72(114):4 (1967), 507-536.
[20] S. Negahban and M.J. Wainwright. Estimation of (near) low-rank matrices with noise and high-dimensional scaling. Annals of Statist. 39 (2011), no. 2, 1069-1097. · Zbl 1216.62090 · doi:10.1214/10-AOS850
[21] A. Rohde, A.B. Tsybakov. Estimation of High-Dimensional Low-Rank Matrices. Ann. Statist. Volume 39, Number 2 (2011), 887-930. · Zbl 1215.62056 · doi:10.1214/10-AOS860
[22] M. Rudelson, R. Vershynin. Non-asymptotic theory of random matrices: extreme singular values. Proceedings of the International Congress of Mathematicians, Hyderabad, India, (2010). · Zbl 1227.60011
[23] M. Yuan, A. Ekici, Z. Lu and R. Monteiro. Dimension Reduction and Coefficient Estimation in Multivariate Linear Regression. Journal of the Royal Statistical Society, Series B, 69 (2007), 329-346. · doi:10.1111/j.1467-9868.2007.00591.x
This reference list is based on information provided by the publisher or from digital mathematics libraries. Its items are heuristically matched to zbMATH identifiers and may contain data conversion errors. In some cases these data have been complemented/enhanced by data from zbMATH Open. This attempts to reflect the references listed in the original paper as accurately as possible without claiming completeness or perfect matching.