Stable rank-one matrix completion is solved by the level \(2\) Lasserre relaxation. (English) Zbl 1478.90076

Summary: This paper studies the problem of deterministic rank-one matrix completion. It is known that the simplest semidefinite programming relaxation, involving minimization of the nuclear norm, does not in general return the solution for this problem. In this paper, we show that in every instance where the problem has a unique solution, one can provably recover the original matrix through the level \(2\) Lasserre relaxation with minimization of the trace norm. We further show that the solution of the proposed semidefinite program is Lipschitz stable with respect to perturbations of the observed entries, unlike more basic algorithms such as nonlinear propagation or ridge regression. Our proof is based on recursively building a certificate of optimality corresponding to a dual sum-of-squares (SoS) polynomial. This SoS polynomial is built from the polynomial ideal generated by the completion constraints and the monomials provided by the minimization of the trace. The proposed relaxation fits in the framework of the Lasserre hierarchy, albeit with the key addition of the trace objective function. Finally, we show how to represent and manipulate the moment tensor in favorable complexity by means of a hierarchical low-rank factorization.
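The paper's starting point, that plain nuclear-norm (trace-norm) minimization can fail on deterministic rank-one instances, is easy to see on a small example. The sketch below is our own illustration, not taken from the paper: it completes the symmetric \(2 \times 2\) matrix with observed entries \(1, 2, 2\) and one missing diagonal entry. The unique rank-one completion sets the missing entry to \(4\) (forcing the determinant to zero), yet the nuclear-norm minimizer fills in \(1\).

```python
import math

def nuclear_norm(x):
    """Nuclear norm of the symmetric matrix [[1, 2], [2, x]].

    For a symmetric 2x2 matrix the nuclear norm is the sum of the
    absolute values of its eigenvalues, computed here in closed form
    from the trace and the discriminant.
    """
    mean = (1.0 + x) / 2.0
    rad = math.sqrt(((1.0 - x) / 2.0) ** 2 + 4.0)
    return abs(mean + rad) + abs(mean - rad)

# Grid search for the fill-in value that minimizes the nuclear norm.
grid = [i / 100.0 for i in range(-500, 1001)]
best = min(grid, key=nuclear_norm)

print(best)                # minimizer is 1.0, not the rank-one value 4.0
print(nuclear_norm(best))  # 4.0
print(nuclear_norm(4.0))   # 5.0, so the true completion is not optimal
```

A semidefinite solver would return the same minimizer; the closed-form eigenvalues just keep the sketch dependency-free. The paper's remedy for such failures is the level \(2\) Lasserre relaxation with a trace objective, certified by a dual sum-of-squares polynomial.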


MSC:
90C22 Semidefinite programming
90C25 Convex programming
65K05 Numerical mathematical programming methods
90C59 Approximation methods and heuristics in mathematical programming


Software: SDPLR; PhaseLift
Full Text: DOI arXiv


[1] A. A. Ahmadi, G. Hall, A. Papachristodoulou, J. Saunderson, and Y. Zheng. Improving efficiency and scalability of sum of squares optimization: Recent advances and limitations. arXiv:1710.01358, 2017.
[2] A. S. Bandeira. Convex relaxations for certain inverse problems on graphs. PhD thesis, Princeton University, 2015.
[3] B. Barak and A. Moitra. Tensor prediction, Rademacher complexity and random 3-XOR. arXiv:1501.06521, 2015.
[4] R. Berke and M. Onsjö. Propagation connectivity of random hypergraphs. In Stochastic Algorithms: Foundations and Applications, pages 117-126. Springer, 2009. · Zbl 1260.68444
[5] Burer, S.; Monteiro, R., A nonlinear programming algorithm for solving semidefinite programs via low-rank factorization, Mathematical Programming, 95, 2, 329-357 (2003) · Zbl 1030.90077
[6] Burer, S.; Monteiro, RD, Local minima and convergence in low-rank semidefinite programming, Mathematical Programming, 103, 3, 427-444 (2005) · Zbl 1099.90040
[7] Candès, EJ; Eldar, YC; Strohmer, T.; Voroninski, V., Phase retrieval via matrix completion, SIAM Review, 57, 2, 225-251 (2015) · Zbl 1344.49057
[8] Candès, EJ; Fernandez-Granda, C., Towards a mathematical theory of super-resolution, Communications on Pure and Applied Mathematics, 67, 6, 906-956 (2014) · Zbl 1350.94011
[9] Candès, EJ; Plan, Y., Matrix completion with noise, Proceedings of the IEEE, 98, 6, 925-936 (2010)
[10] Candès, EJ; Recht, B., Exact matrix completion via convex optimization, Foundations of Computational Mathematics, 9, 6, 717-772 (2009) · Zbl 1219.90124
[11] Candès, EJ; Strohmer, T.; Voroninski, V., PhaseLift: exact and stable signal recovery from magnitude measurements via convex programming, Communications on Pure and Applied Mathematics, 66, 8, 1241-1274 (2013) · Zbl 1335.94013
[12] Candès, EJ; Tao, T., The power of convex relaxation: near-optimal matrix completion, IEEE Transactions on Information Theory, 56, 5, 2053-2080 (2010) · Zbl 1366.15021
[13] A. Cosse and L. Demanet. Rank-one matrix completion is solved by the sum-of-squares relaxation of order two. In Proceedings of the 6th IEEE International Workshop on Computational Advances in Multi-Sensor Adaptive Processing (CAMSAP’15). IEEE, 2015.
[14] Cui, C-F; Dai, Y-H; Nie, J., All real eigenvalues of symmetric tensors, SIAM Journal on Matrix Analysis and Applications, 35, 4, 1582-1601 (2014) · Zbl 1312.65053
[15] L. Demanet and V. Jugnon. Convex recovery from interferometric measurements. arXiv:1307.6864, 2013.
[16] M. Fazel. Matrix rank minimization with applications. PhD thesis, Stanford University, March 2002.
[17] Goemans, MX; Williamson, DP, Improved approximation algorithms for maximum cut and satisfiability problems using semidefinite programming, Journal of the ACM (JACM), 42, 6, 1115-1145 (1995) · Zbl 0885.68088
[18] Gouveia, J.; Parrilo, PA; Thomas, RR, Theta bodies for polynomial ideals, SIAM Journal on Optimization, 20, 4, 2097-2118 (2010) · Zbl 1213.90190
[19] Jugnon, V.; Demanet, L., Interferometric inversion: a robust approach to linear inverse problems, In Proceedings of SEG Annual Meeting, Houston, 5180-5184 (2013)
[20] R. Keshavan, A. Montanari, and S. Oh. Matrix completion from noisy entries. In Advances in Neural Information Processing Systems, pages 952-960, 2009. · Zbl 1242.62069
[21] R. H. Keshavan, A. Montanari, and S. Oh. Learning low rank matrices from \(O(n)\) entries. In Proceedings of the 46th Annual Allerton Conference on Communication, Control, and Computing, pages 1365-1372. IEEE, 2008.
[22] Keshavan, RH; Montanari, A.; Oh, S., Matrix completion from a few entries, IEEE Transactions on Information Theory, 56, 6, 2980-2998 (2010) · Zbl 1366.62111
[23] F. Király and L. Theran. Error-minimizing estimates and universal entry-wise error bounds for low-rank matrix completion. In Advances in Neural Information Processing Systems, pages 2364-2372, 2013.
[24] F. Király and R. Tomioka. A combinatorial algebraic approach for the identifiability of low-rank matrix completion. arXiv:1206.6470, 2012. · Zbl 1354.15019
[25] Király, FJ; Theran, L.; Tomioka, R., The algebraic combinatorial approach for low-rank matrix completion, Journal of Machine Learning Research, 16, 1391-1436 (2015) · Zbl 1354.15019
[26] Lasserre, JB, Global optimization with polynomials and the problem of moments, SIAM Journal on Optimization, 11, 3, 796-817 (2001) · Zbl 1010.90061
[27] Lasserre, JB, Convergent SDP-relaxations in polynomial optimization with sparsity, SIAM Journal on Optimization, 17, 3, 822-843 (2006) · Zbl 1119.90036
[28] M. Laurent. Sums of squares, moment matrices and optimization over polynomials. In Emerging applications of algebraic geometry, pages 157-270. Springer, 2009. · Zbl 1163.13021
[29] Y. Nesterov. Squared functional systems and optimization problems. In High performance optimization, pages 405-440. Springer, 2000. · Zbl 0958.90090
[30] Nie, J., An exact Jacobian SDP relaxation for polynomial optimization, Mathematical Programming, 137, 1-2, 225-255 (2013) · Zbl 1266.65094
[31] Nie, J.; Demmel, J., Sparse SoS relaxations for minimizing functions that are summations of small polynomials, SIAM Journal on Optimization, 19, 4, 1534-1558 (2008) · Zbl 1178.65048
[32] P. A. Parrilo. Structured semidefinite programs and semialgebraic geometry methods in robustness and optimization. PhD thesis, California Institute of Technology, 2000.
[33] Parrilo, PA, Semidefinite programming relaxations for semialgebraic problems, Mathematical Programming, 96, 2, 293-320 (2003) · Zbl 1043.14018
[34] Recht, B., A simpler approach to matrix completion, The Journal of Machine Learning Research, 12, 3413-3430 (2011) · Zbl 1280.68141
[35] Recht, B.; Fazel, M.; Parrilo, P., Guaranteed minimum-rank solutions of linear matrix equations via nuclear norm minimization, SIAM Review, 52, 3, 471-501 (2010) · Zbl 1198.90321
[36] R. T. Rockafellar. Convex analysis. Princeton Mathematical Series, No. 28. Princeton University Press, 1970.
[37] Shor, N., Class of global minimum bounds of polynomial functions, Cybernetics and Systems Analysis, 23, 6, 731-734 (1987) · Zbl 0648.90058
[38] Shor, NZ, Quadratic optimization problems, Soviet Journal of Computer and Systems Sciences, 25, 6, 1-11 (1987) · Zbl 0655.90055
[39] Shor, NZ, An approach to obtaining global extremums in polynomial mathematical programming problems, Cybernetics, 23, 5, 695-700 (1988) · Zbl 0654.90076
[40] Singer, A.; Cucuringu, M., Uniqueness of low-rank matrix completion by rigidity theory, SIAM Journal on Matrix Analysis and Applications, 31, 4, 1621-1641 (2010) · Zbl 1221.15038
[41] G. Tang and P. Shah. Guaranteed tensor decomposition: A moment approach. In Proceedings of The 32nd International Conference on Machine Learning, pages 1491-1500, 2015.