
Low-rank matrix completion via preconditioned optimization on the Grassmann manifold. (English) Zbl 1312.90092

Summary: We address the numerical problem of recovering large matrices of low rank when most of the entries are unknown. We exploit the geometry of the low-rank constraint to recast the problem as an unconstrained optimization problem on a single Grassmann manifold. We then apply second-order Riemannian trust-region methods (RTRMC 2) and Riemannian conjugate gradient methods (RCGMC) to solve it. We introduce a preconditioner for the Hessian that helps control the conditioning of the problem, and we detail preconditioned versions of Riemannian optimization algorithms. The cost of each iteration is linear in the number of known entries. The proposed methods are competitive with state-of-the-art algorithms on a wide range of problem instances; in particular, they perform well on rectangular matrices. We further note that second-order and preconditioned methods are well suited to solving badly conditioned matrix completion tasks.
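
To make the formulation concrete: the objective optimized on the Grassmann manifold depends only on the column space spanned by a matrix U with orthonormal columns. For a fixed U, the best coefficient matrix W is obtained by least squares on the observed entries, and the cost is the squared residual on those entries. The following is a minimal numpy sketch of evaluating such a cost; the function name completion_cost and the dense boolean mask are illustrative choices, not the authors' Manopt implementation, which uses regularized least squares and sparse residuals so that the per-iteration cost stays linear in the number of known entries.

```python
import numpy as np

def completion_cost(U, M, mask):
    """Evaluate f(U) = 0.5 * ||P_Omega(U W - M)||_F^2, where W solves the
    inner least-squares problem column by column. U has orthonormal columns
    and represents a point on the Grassmann manifold Gr(m, r)."""
    m, n = M.shape
    r = U.shape[1]
    W = np.zeros((r, n))
    for j in range(n):
        rows = mask[:, j]                      # observed rows in column j
        if rows.any():
            # least-squares fit of the observed entries of column j in span(U)
            W[:, j], *_ = np.linalg.lstsq(U[rows, :], M[rows, j], rcond=None)
    residual = mask * (U @ W - M)              # residual on observed entries only
    return 0.5 * np.sum(residual ** 2), W

# small synthetic example
rng = np.random.default_rng(0)
m, n, r = 50, 40, 3
M = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))   # rank-r matrix
mask = rng.random((m, n)) < 0.3                                  # ~30% observed
U0, _ = np.linalg.qr(rng.standard_normal((m, r)))                # point on Gr(m, r)
f0, _ = completion_cost(U0, M, mask)
```

A Riemannian trust-region or conjugate gradient solver then minimizes this cost over the Grassmann manifold, i.e. over equivalence classes of such U, with the preconditioner applied to the (approximate) Hessian at each iterate.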

MSC:

90C90 Applications of mathematical programming
53B21 Methods of local Riemannian geometry
15A83 Matrix completion problems
