zbMATH — the first resource for mathematics

Implicit regularization in nonconvex statistical estimation: gradient descent converges linearly for phase retrieval, matrix completion, and blind deconvolution. (English) Zbl 1445.90089
Summary: Recent years have seen a flurry of activity in designing provably efficient nonconvex procedures for solving statistical estimation problems. Due to the highly nonconvex nature of the empirical loss, state-of-the-art procedures often require proper regularization (e.g., trimming, regularized cost, projection) in order to guarantee fast convergence. For vanilla procedures such as gradient descent, however, prior theory either recommends highly conservative learning rates to avoid overshooting, or lacks performance guarantees altogether. This paper uncovers a striking phenomenon in nonconvex optimization: even in the absence of explicit regularization, gradient descent enforces proper regularization implicitly under various statistical models. In fact, gradient descent follows a trajectory that stays within a basin enjoying benign geometry, consisting of points incoherent with the sampling mechanism. This “implicit regularization” feature allows gradient descent to proceed far more aggressively without overshooting, which in turn results in substantial computational savings. Focusing on three fundamental statistical estimation problems, namely phase retrieval, low-rank matrix completion, and blind deconvolution, we establish that gradient descent achieves near-optimal statistical and computational guarantees without explicit regularization. In particular, by marrying statistical modeling with generic optimization theory, we develop a general recipe for analyzing the trajectories of iterative algorithms via a leave-one-out perturbation argument. As a by-product, for noisy matrix completion we demonstrate that gradient descent achieves near-optimal error control, measured both entrywise and in the spectral norm, which might be of independent interest.
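To make the summary's "vanilla procedure" concrete, the following is a minimal sketch (not the paper's implementation) of unregularized gradient descent for real-valued phase retrieval: spectral initialization followed by plain gradient steps on the quadratic loss, with no truncation, trimming, or projection. The Gaussian design, problem sizes, and step-size choice below are illustrative assumptions.

```python
import numpy as np

def phase_retrieval_gd(y, A, mu=0.1, iters=500):
    """Vanilla gradient descent on the quadratic loss
    f(x) = (1/4m) * sum_i ((a_i^T x)^2 - y_i)^2,
    started from a standard spectral initialization and run
    without any explicit regularization."""
    m, n = A.shape
    # Spectral initialization: leading eigenvector of
    # Y = (1/m) * sum_i y_i a_i a_i^T, scaled so that
    # ||x0||^2 matches the energy estimate mean(y) ~ ||x*||^2.
    Y = (A.T * y) @ A / m
    _, V = np.linalg.eigh(Y)          # eigenvalues in ascending order
    norm_sq = np.mean(y)
    x = V[:, -1] * np.sqrt(norm_sq)
    step = mu / norm_sq               # Wirtinger-flow-style step size
    for _ in range(iters):
        z = A @ x                      # measurements a_i^T x
        grad = A.T @ ((z**2 - y) * z) / m
        x = x - step * grad
    return x

# Synthetic noiseless Gaussian demo (illustrative sizes).
rng = np.random.default_rng(0)
n, m = 20, 400
x_star = rng.standard_normal(n)
A = rng.standard_normal((m, n))
y = (A @ x_star) ** 2
x_hat = phase_retrieval_gd(y, A)
# The signal is identifiable only up to a global sign.
err = min(np.linalg.norm(x_hat - x_star),
          np.linalg.norm(x_hat + x_star)) / np.linalg.norm(x_star)
```

Note the constant (dimension-independent) step size `mu`: the paper's point is that such an aggressive choice succeeds without explicit regularization because the iterates implicitly stay incoherent with the sampling vectors.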

90C26 Nonconvex programming, global optimization
Full Text: DOI
[1] Abbe, E., Fan, J., Wang, K., Zhong, Y.: Entrywise eigenvector analysis of random matrices with low expected rank. arXiv preprint arXiv:1709.09565 (2017)
[2] Aghasi, A., Ahmed, A., Hand, P., Joshi, B.: BranchHull: Convex bilinear inversion from the entrywise product of signals with known signs. Applied and Computational Harmonic Analysis (2019)
[3] Ahmed, A.; Recht, B.; Romberg, J., Blind deconvolution using convex programming, IEEE Transactions on Information Theory, 60, 3, 1711-1732 (2014) · Zbl 1360.94057
[4] Alon, N., Spencer, J.H.: The Probabilistic Method (3rd Edition). Wiley (2008) · Zbl 1148.05001
[5] Bahmani, S., Romberg, J.: Phase retrieval meets statistical learning theory: A flexible convex relaxation. In: Artificial Intelligence and Statistics, pp. 252-260 (2017) · Zbl 1408.62032
[6] Bendory, T., Eldar, Y.C., Boumal, N.: Non-convex phase retrieval from STFT measurements. IEEE Transactions on Information Theory (2017) · Zbl 1390.94093
[7] Bhojanapalli, S., Neyshabur, B., Srebro, N.: Global optimality of local search for low rank matrix recovery. In: Advances in Neural Information Processing Systems, pp. 3873-3881 (2016)
[8] Bousquet, O.; Elisseeff, A., Stability and generalization, Journal of Machine Learning Research, 2, Mar, 499-526 (2002) · Zbl 1007.68083
[9] Bubeck, S.: Convex optimization: Algorithms and complexity. Foundations and Trends in Machine Learning 8(3-4), 231-357 (2015) · Zbl 1365.90196
[10] Cai, JF; Liu, H.; Wang, Y., Fast rank-one alternating minimization algorithm for phase retrieval, Journal of Scientific Computing, 79, 1, 128-147 (2019) · Zbl 1412.94022
[11] Cai, T.; Zhang, A., ROP: Matrix recovery via rank-one projections, The Annals of Statistics, 43, 1, 102-138 (2015) · Zbl 1308.62120
[12] Cai, TT; Li, X.; Ma, Z., Optimal rates of convergence for noisy sparse phase retrieval via thresholded Wirtinger flow, The Annals of Statistics, 44, 5, 2221-2251 (2016) · Zbl 1349.62019
[13] Candès, E.; Plan, Y., A probabilistic and RIPless theory of compressed sensing, IEEE Transactions on Information Theory, 57, 11, 7235-7254 (2011) · Zbl 1365.94174
[14] Candès, E.; Tao, T., The power of convex relaxation: Near-optimal matrix completion, IEEE Transactions on Information Theory, 56, 5, 2053-2080 (2010) · Zbl 1366.15021
[15] Candès, EJ; Eldar, YC; Strohmer, T.; Voroninski, V., Phase retrieval via matrix completion, SIAM Journal on Imaging Sciences, 6, 1, 199-225 (2013) · Zbl 1280.49052
[16] Candès, EJ; Li, X., Solving quadratic equations via PhaseLift when there are about as many equations as unknowns, Foundations of Computational Mathematics, 14, 5, 1017-1026 (2014) · Zbl 1312.90054
[17] Candès, EJ; Li, X.; Ma, Y.; Wright, J., Robust principal component analysis?, Journal of ACM, 58, 3, 11:1-11:37 (2011) · Zbl 1327.62369
[18] Candès, EJ; Li, X.; Soltanolkotabi, M., Phase retrieval via Wirtinger flow: Theory and algorithms, IEEE Transactions on Information Theory, 61, 4, 1985-2007 (2015) · Zbl 1359.94069
[19] Candès, EJ; Recht, B., Exact matrix completion via convex optimization, Foundations of Computational Mathematics, 9, 6, 717-772 (2009) · Zbl 1219.90124
[20] Candès, EJ; Strohmer, T.; Voroninski, V., Phaselift: Exact and stable signal recovery from magnitude measurements via convex programming, Communications on Pure and Applied Mathematics, 66, 8, 1017-1026 (2013)
[21] Chandrasekaran, V.; Sanghavi, S.; Parrilo, PA; Willsky, AS, Rank-sparsity incoherence for matrix decomposition, SIAM Journal on Optimization, 21, 2, 572-596 (2011) · Zbl 1226.90067
[22] Chen, P., Fannjiang, A., Liu, G.R.: Phase retrieval with one or two diffraction patterns by alternating projections with the null initialization. Journal of Fourier Analysis and Applications, pp. 1-40 (2015) · Zbl 06897259
[23] Chen, Y., Incoherence-optimal matrix completion, IEEE Transactions on Information Theory, 61, 5, 2909-2923 (2015) · Zbl 1359.15022
[24] Chen, Y.; Candès, E., The projected power method: An efficient algorithm for joint alignment from pairwise differences, Communications on Pure and Applied Mathematics, 71, 8, 1648-1714 (2018) · Zbl 06919696
[25] Chen, Y.; Candès, EJ, Solving random quadratic systems of equations is nearly as easy as solving linear systems, Communications on Pure and Applied Mathematics, 70, 5, 822-883 (2017) · Zbl 1379.90024
[26] Chen, Y., Cheng, C., Fan, J.: Asymmetry helps: Eigenvalue and eigenvector analyses of asymmetrically perturbed low-rank matrices. arXiv preprint arXiv:1811.12804 (2018)
[27] Chen, Y.; Chi, Y.; Fan, J.; Ma, C., Gradient descent with random initialization: Fast global convergence for nonconvex phase retrieval, Mathematical Programming, 176, 1-2, 5-37 (2019) · Zbl 1415.90086
[28] Chen, Y., Chi, Y., Fan, J., Ma, C., Yan, Y.: Noisy matrix completion: Understanding statistical guarantees for convex relaxation via nonconvex optimization. arXiv preprint arXiv:1902.07698 (2019)
[29] Chen, Y.; Chi, Y.; Goldsmith, AJ, Exact and stable covariance estimation from quadratic sampling via convex programming, IEEE Transactions on Information Theory, 61, 7, 4034-4059 (2015) · Zbl 1359.62181
[30] Chen, Y.; Fan, J.; Ma, C.; Wang, K., Spectral method and regularized MLE are both optimal for top-\(K\) ranking, Annals of Statistics, 47, 4, 2204-2235 (2019) · Zbl 1425.62038
[31] Chen, Y., Fan, J., Ma, C., Yan, Y.: Inference and uncertainty quantification for noisy matrix completion. arXiv preprint arXiv:1906.04159 (2019) · Zbl 1431.90117
[32] Chen, Y., Wainwright, M.J.: Fast low-rank estimation by projected gradient descent: General statistical and algorithmic guarantees. arXiv preprint arXiv:1509.03025 (2015)
[33] Chen, Y., Yi, X., Caramanis, C.: A convex formulation for mixed regression with two components: Minimax optimal rates. In: Conference on Learning Theory, pp. 560-604 (2014)
[34] Cherapanamjeri, Y., Jain, P., Netrapalli, P.: Thresholding based outlier robust PCA. In: Conference on Learning Theory, pp. 593-628 (2017)
[35] Chi, Y., Guaranteed blind sparse spikes deconvolution via lifting and convex optimization, IEEE Journal of Selected Topics in Signal Processing, 10, 4, 782-794 (2016)
[36] Chi, Y.; Lu, YM, Kaczmarz method for solving quadratic equations, IEEE Signal Processing Letters, 23, 9, 1183-1187 (2016)
[37] Chi, Y., Lu, Y.M., Chen, Y.: Nonconvex optimization meets low-rank matrix factorization: An overview. arXiv preprint arXiv:1809.09573 (2018) · Zbl 07123429
[38] Davenport, MA; Romberg, J., An overview of low-rank matrix recovery from incomplete observations, IEEE Journal of Selected Topics in Signal Processing, 10, 4, 608-622 (2016)
[39] Davis, C.; Kahan, WM, The rotation of eigenvectors by a perturbation. III, SIAM Journal on Numerical Analysis, 7, 1, 1-46 (1970) · Zbl 0198.47201
[40] Davis, D., Drusvyatskiy, D., Paquette, C.: The nonsmooth landscape of phase retrieval. arXiv preprint arXiv:1711.03247 (2017)
[41] Dhifallah, O., Thrampoulidis, C., Lu, Y.M.: Phase retrieval via linear programming: Fundamental limits and algorithmic improvements. arXiv preprint arXiv:1710.05234 (2017)
[42] Dopico, FM, A note on \(\sin \Theta\) theorems for singular subspace variations, BIT, 40, 2, 395-403 (2000) · Zbl 0961.65034
[43] Duchi, J.C., Ruan, F.: Solving (most) of a set of quadratic equalities: Composite optimization for robust phase retrieval. Information and Inference (2018)
[44] El Karoui, N.: On the impact of predictor geometry on the performance on high-dimensional ridge-regularized generalized robust regression estimators. Probability Theory and Related Fields pp. 1-81 (2015)
[45] El Karoui, N.; Bean, D.; Bickel, PJ; Lim, C.; Yu, B., On robust regression with high-dimensional predictors, Proceedings of the National Academy of Sciences, 110, 36, 14557-14562 (2013) · Zbl 1359.62184
[46] Fan, J., Ma, C., Zhong, Y.: A selective overview of deep learning. arXiv preprint arXiv:1904.05526 (2019)
[47] Gao, B., Xu, Z.: Phase retrieval using Gauss-Newton method. arXiv preprint arXiv:1606.08135 (2016)
[48] Ge, R., Lee, J.D., Ma, T.: Matrix completion has no spurious local minimum. In: Advances in Neural Information Processing Systems, pp. 2973-2981 (2016)
[49] Ge, R., Ma, T.: On the optimization landscape of tensor decompositions. In: Advances in Neural Information Processing Systems, pp. 3653-3663 (2017)
[50] Goldstein, T.; Studer, C., PhaseMax: Convex phase retrieval via basis pursuit, IEEE Transactions on Information Theory, 64, 4, 2675-2689 (2018) · Zbl 1390.94194
[51] Gross, D., Recovering low-rank matrices from few coefficients in any basis, IEEE Transactions on Information Theory, 57, 3, 1548-1566 (2011) · Zbl 1366.94103
[52] Gunasekar, S., Woodworth, B.E., Bhojanapalli, S., Neyshabur, B., Srebro, N.: Implicit regularization in matrix factorization. In: Advances in Neural Information Processing Systems, pp. 6151-6159 (2017)
[53] Hand, P.; Voroninski, V., An elementary proof of convex phase retrieval in the natural parameter space via the linear program PhaseMax, Communications in Mathematical Sciences, 16, 7, 2047-2051 (2018) · Zbl 1441.94040
[54] Hardt, M., Wootters, M.: Fast matrix completion without the condition number. Conference on Learning Theory, pp. 638-678 (2014)
[55] Hastie, T.; Mazumder, R.; Lee, JD; Zadeh, R., Matrix completion and low-rank SVD via fast alternating least squares, Journal of Machine Learning Research, 16, 3367-3402 (2015) · Zbl 1352.65117
[56] Higham, NJ, Estimating the matrix \(p\)-norm, Numerische Mathematik, 62, 1, 539-555 (1992)
[57] Hsu, D.; Kakade, SM; Zhang, T., A tail inequality for quadratic forms of subgaussian random vectors, Electron. Commun. Probab., 17, 52, 6 (2012) · Zbl 1309.60017
[58] Huang, W.; Hand, P., Blind deconvolution by a steepest descent algorithm on a quotient manifold, SIAM Journal on Imaging Sciences, 11, 4, 2757-2785 (2018) · Zbl 1425.65069
[59] Jaganathan, K., Eldar, Y.C., Hassibi, B.: Phase retrieval: An overview of recent developments. arXiv preprint arXiv:1510.07713 (2015)
[60] Jain, P., Netrapalli, P.: Fast exact matrix completion with finite samples. In: Conference on Learning Theory, pp. 1007-1034 (2015)
[61] Jain, P., Netrapalli, P., Sanghavi, S.: Low-rank matrix completion using alternating minimization. In: ACM Symposium on Theory of Computing, pp. 665-674 (2013) · Zbl 1293.65073
[62] Javanmard, A.; Montanari, A., Debiasing the lasso: Optimal sample size for Gaussian designs, The Annals of Statistics, 46, 6, 2593-2622 (2018) · Zbl 1407.62270
[63] Jin, C., Kakade, S.M., Netrapalli, P.: Provable efficient online matrix completion via non-convex stochastic gradient descent. In: Advances in Neural Information Processing Systems, pp. 4520-4528 (2016)
[64] Keshavan, RH; Montanari, A.; Oh, S., Matrix completion from a few entries, IEEE Transactions on Information Theory, 56, 6, 2980-2998 (2010) · Zbl 1366.62111
[65] Keshavan, RH; Montanari, A.; Oh, S., Matrix completion from noisy entries, J. Mach. Learn. Res., 11, 2057-2078 (2010) · Zbl 1242.62069
[66] Koltchinskii, V.: Oracle inequalities in empirical risk minimization and sparse recovery problems, Lecture Notes in Mathematics, vol. 2033. Springer, Heidelberg (2011). 10.1007/978-3-642-22147-7 · Zbl 1223.91002
[67] Koltchinskii, V.; Lounici, K.; Tsybakov, AB, Nuclear-norm penalization and optimal rates for noisy low-rank matrix completion, Ann. Statist., 39, 5, 2302-2329 (2011) · Zbl 1231.62097
[68] Kolte, R., Özgür, A.: Phase retrieval via incremental truncated Wirtinger flow. arXiv preprint arXiv:1606.03196 (2016)
[69] Kreutz-Delgado, K.: The complex gradient operator and the CR-calculus. arXiv preprint arXiv:0906.4835 (2009)
[70] Lang, S.: Real and Functional Analysis. Springer-Verlag, New York (1993)
[71] Lee, K.; Bresler, Y., Admira: Atomic decomposition for minimum rank approximation, IEEE Transactions on Information Theory, 56, 9, 4402-4416 (2010) · Zbl 1366.94112
[72] Lee, K.; Li, Y.; Junge, M.; Bresler, Y., Blind recovery of sparse signals from subsampled convolution, IEEE Transactions on Information Theory, 63, 2, 802-821 (2017) · Zbl 1364.94261
[73] Lee, K.; Tian, N.; Romberg, J., Fast and guaranteed blind multichannel deconvolution under a bilinear system model, IEEE Transactions on Information Theory, 64, 7, 4792-4818 (2018) · Zbl 1401.94119
[74] Lerman, G.; Maunu, T., Fast, robust and non-convex subspace recovery, Information and Inference: A Journal of the IMA, 7, 2, 277-336 (2017) · Zbl 07127808
[75] Li, Q., Tang, G.: The nonconvex geometry of low-rank matrix optimizations with general objective functions. In: 2017 IEEE Global Conference on Signal and Information Processing (GlobalSIP), pp. 1235-1239. IEEE (2017)
[76] Li, X., Ling, S., Strohmer, T., Wei, K.: Rapid, robust, and reliable blind deconvolution via nonconvex optimization. Applied and computational harmonic analysis (2018) · Zbl 1422.94013
[77] Li, X., Wang, Z., Lu, J., Arora, R., Haupt, J., Liu, H., Zhao, T.: Symmetry, saddle points, and global geometry of nonconvex matrix factorization. arXiv preprint arXiv:1612.09296 (2016) · Zbl 1432.90123
[78] Li, Y., Lee, K., Bresler, Y.: Blind gain and phase calibration for low-dimensional or sparse signal sensing via power iteration. In: Sampling Theory and Applications (SampTA), 2017 International Conference on, pp. 119-123. IEEE (2017)
[79] Li, Y., Ma, C., Chen, Y., Chi, Y.: Nonconvex matrix factorization from rank-one measurements. arXiv preprint arXiv:1802.06286 (2018)
[80] Lin, J., Camoriano, R., Rosasco, L.: Generalization properties and implicit regularization for multiple passes SGM. In: International Conference on Machine Learning, pp. 2340-2348 (2016)
[81] Ling, S.; Strohmer, T., Self-calibration and biconvex compressive sensing, Inverse Problems, 31, 11, 115002 (2015) · Zbl 1327.93183
[82] Ling, S.; Strohmer, T., Regularized gradient descent: a non-convex recipe for fast joint blind deconvolution and demixing, Information and Inference: A Journal of the IMA, 8, 1, 1-49 (2018) · Zbl 07127821
[83] Lu, Y.M., Li, G.: Phase transitions of spectral initialization for high-dimensional nonconvex estimation. arXiv preprint arXiv:1702.06435 (2017)
[84] Mathias, R., The spectral norm of a nonnegative matrix, Linear Algebra Appl., 139, 269-284 (1990) · Zbl 0705.15012
[85] Mathias, R., Perturbation bounds for the polar decomposition, SIAM Journal on Matrix Analysis and Applications, 14, 2, 588-597 (1993) · Zbl 0773.15010
[86] Maunu, T.; Zhang, T.; Lerman, G., A well-tempered landscape for non-convex robust subspace recovery, Journal of Machine Learning Research, 20, 37, 1-59 (2019) · Zbl 07049756
[87] Mei, S.; Bai, Y.; Montanari, A., The landscape of empirical risk for nonconvex losses, The Annals of Statistics, 46, 6, 2747-2774 (2018) · Zbl 1409.62117
[88] Mondelli, M., Montanari, A.: Fundamental limits of weak recovery with applications to phase retrieval. Foundations of Computational Mathematics, pp. 1-71 (2017) · Zbl 07063483
[89] Negahban, S.; Wainwright, MJ, Restricted strong convexity and weighted matrix completion: optimal bounds with noise, J. Mach. Learn. Res., 13, 1665-1697 (2012) · Zbl 1436.62204
[90] Netrapalli, P., Jain, P., Sanghavi, S.: Phase retrieval using alternating minimization. Advances in Neural Information Processing Systems (NIPS) (2013) · Zbl 1394.94421
[91] Netrapalli, P., Niranjan, U., Sanghavi, S., Anandkumar, A., Jain, P.: Non-convex robust PCA. In: Advances in Neural Information Processing Systems, pp. 1107-1115 (2014)
[92] Qu, Q., Zhang, Y., Eldar, Y., Wright, J.: Convolutional phase retrieval via gradient descent. Neural Information Processing Systems (2017)
[93] Recht, B., A simpler approach to matrix completion, Journal of Machine Learning Research, 12, Dec, 3413-3430 (2011) · Zbl 1280.68141
[94] Recht, B.; Fazel, M.; Parrilo, PA, Guaranteed minimum-rank solutions of linear matrix equations via nuclear norm minimization, SIAM Review, 52, 3, 471-501 (2010) · Zbl 1198.90321
[95] Rudelson, M., Vershynin, R., et al.: Hanson-Wright inequality and sub-Gaussian concentration. Electronic Communications in Probability 18 (2013) · Zbl 1329.60056
[96] Sanghavi, S.; Ward, R.; White, CD, The local convexity of solving systems of quadratic equations, Results in Mathematics, 71, 3-4, 569-608 (2017) · Zbl 1383.65064
[97] Schmitt, BA, Perturbation bounds for matrix square roots and Pythagorean sums, Linear Algebra Appl., 174, 215-227 (1992) · Zbl 0758.15006
[98] Schniter, P.; Rangan, S., Compressive phase retrieval via generalized approximate message passing, IEEE Transactions on Signal Processing, 63, 4, 1043-1055 (2015) · Zbl 1394.94506
[99] Schudy, W., Sviridenko, M.: Concentration and moment inequalities for polynomials of independent random variables. In: Symposium on Discrete Algorithms, pp. 437-446. ACM, New York (2012) · Zbl 1423.60039
[100] Shechtman, Y.; Beck, A.; Eldar, YC, GESPAR: Efficient phase retrieval of sparse signals, IEEE Transactions on Signal Processing, 62, 4, 928-938 (2014) · Zbl 1394.94522
[101] Soltanolkotabi, M.: Algorithms and theory for clustering and nonconvex quadratic programming. Ph.D. thesis, Stanford University (2014)
[102] Soltanolkotabi, M., Structured signal recovery from quadratic measurements: Breaking sample complexity barriers via nonconvex optimization, IEEE Transactions on Information Theory, 65, 4, 2374-2400 (2019) · Zbl 1431.94025
[103] Soltanolkotabi, M.; Javanmard, A.; Lee, JD, Theoretical insights into the optimization landscape of over-parameterized shallow neural networks, IEEE Transactions on Information Theory, 65, 2, 742-769 (2019) · Zbl 1428.68255
[104] Soudry, D.; Hoffer, E.; Nacson, MS; Gunasekar, S.; Srebro, N., The implicit bias of gradient descent on separable data, The Journal of Machine Learning Research, 19, 1, 2822-2878 (2018)
[105] Sun, J., Qu, Q., Wright, J.: A geometric analysis of phase retrieval. In: Information Theory (ISIT), 2016 IEEE International Symposium on, pp. 2379-2383. IEEE (2016)
[106] Sun, J.; Qu, Q.; Wright, J., Complete dictionary recovery over the sphere I: Overview and the geometric picture, IEEE Transactions on Information Theory, 63, 2, 853-884 (2017) · Zbl 1364.94164
[107] Sun, R.; Luo, ZQ, Guaranteed matrix completion via non-convex factorization, IEEE Transactions on Information Theory, 62, 11, 6535-6579 (2016) · Zbl 1359.94179
[108] Sur, P., Chen, Y., Candès, E.J.: The likelihood ratio test in high-dimensional logistic regression is asymptotically a rescaled chi-square. arXiv preprint arXiv:1706.01191, accepted to Probability Theory and Related Fields (2017) · Zbl 1431.62319
[109] Tan, YS; Vershynin, R., Phase retrieval via randomized Kaczmarz: Theoretical guarantees, Information and Inference: A Journal of the IMA, 8, 1, 97-123 (2018) · Zbl 07127823
[110] Tanner, J.; Wei, K., Low rank matrix completion by alternating steepest descent methods, Applied and Computational Harmonic Analysis, 40, 2, 417-429 (2016) · Zbl 1336.65047
[111] Tao, T.: Topics in Random Matrix Theory. Graduate Studies in Mathematics. American Mathematical Society, Providence, Rhode Island (2012)
[112] Ten Berge, JM, Orthogonal Procrustes rotation for two or more matrices, Psychometrika, 42, 2, 267-276 (1977) · Zbl 0362.92020
[113] Tropp, J.A.: Convex recovery of a structured signal from independent random linear measurements. In: Sampling Theory, a Renaissance, pp. 67-101. Springer (2015) · Zbl 1358.94034
[114] Tropp, JA, An introduction to matrix concentration inequalities, Found. Trends Mach. Learn., 8, 1-2, 1-230 (2015) · Zbl 1391.15071
[115] Tu, S., Boczar, R., Simchowitz, M., Soltanolkotabi, M., Recht, B.: Low-rank solutions of linear matrix equations via Procrustes flow. In: International Conference on Machine Learning, pp. 964-973. JMLR.org (2016)
[116] Vershynin, R.: Introduction to the non-asymptotic analysis of random matrices. Compressed Sensing, Theory and Applications, pp. 210-268 (2012)
[117] Wang, G., Giannakis, G., Saad, Y., Chen, J.: Solving most systems of random quadratic equations. In: Advances in Neural Information Processing Systems, pp. 1867-1877 (2017)
[118] Wang, G., Giannakis, G.B., Eldar, Y.C.: Solving systems of random quadratic equations via truncated amplitude flow. IEEE Transactions on Information Theory (2017) · Zbl 1390.90451
[119] Wang, G.; Zhang, L.; Giannakis, GB; Akçakaya, M.; Chen, J., Sparse phase retrieval via truncated amplitude flow, IEEE Transactions on Signal Processing, 66, 2, 479-491 (2018) · Zbl 1414.94656
[120] Wang, L.; Chi, Y., Blind deconvolution from multiple sparse inputs, IEEE Signal Processing Letters, 23, 10, 1384-1388 (2016)
[121] Wedin, PÅ, Perturbation bounds in connection with singular value decomposition, BIT Numerical Mathematics, 12, 1, 99-111 (1972) · Zbl 0239.15015
[122] Wei, K., Solving systems of phaseless equations via Kaczmarz methods: A proof of concept study, Inverse Problems, 31, 12, 125008 (2015) · Zbl 1332.65045
[123] Wei, K.; Cai, JF; Chan, TF; Leung, S., Guarantees of Riemannian optimization for low rank matrix recovery, SIAM Journal on Matrix Analysis and Applications, 37, 3, 1198-1222 (2016) · Zbl 1347.65109
[124] Yu, Y.; Wang, T.; Samworth, RJ, A useful variant of the Davis-Kahan theorem for statisticians, Biometrika, 102, 2, 315-323 (2015) · Zbl 1452.15010
[125] Zhang, C., Bengio, S., Hardt, M., Recht, B., Vinyals, O.: Understanding deep learning requires rethinking generalization. International Conference on Learning Representations (2017)
[126] Zhang, H., Chi, Y., Liang, Y.: Provable non-convex phase retrieval with outliers: Median truncated Wirtinger flow. In: International conference on machine learning, pp. 1022-1031 (2016)
[127] Zhang, H., Zhou, Y., Liang, Y., Chi, Y.: A nonconvex approach for phase retrieval: Reshaped Wirtinger flow and incremental algorithms. Journal of Machine Learning Research (2017) · Zbl 1440.94020
[128] Zhang, Y., Lau, Y., Kuo, H.-W., Cheung, S., Pasupathy, A., Wright, J.: On the global geometry of sphere-constrained sparse blind deconvolution. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 4894-4902 (2017)
[129] Zhao, T., Wang, Z., Liu, H.: A nonconvex optimization framework for low rank matrix estimation. In: Advances in Neural Information Processing Systems, pp. 559-567 (2015)
[130] Zheng, Q., Lafferty, J.: A convergent gradient descent algorithm for rank minimization and semidefinite programming from random linear measurements. In: Advances in Neural Information Processing Systems, pp. 109-117 (2015)
[131] Zheng, Q., Lafferty, J.: Convergence analysis for rectangular matrix completion using Burer-Monteiro factorization and gradient descent. arXiv preprint arXiv:1605.07051 (2016)
[132] Zhong, K., Song, Z., Jain, P., Bartlett, P.L., Dhillon, I.S.: Recovery guarantees for one-hidden-layer neural networks. In: International Conference on Machine Learning, pp. 4140-4149. JMLR.org (2017)
[133] Zhong, Y.; Boumal, N., Near-optimal bounds for phase synchronization, SIAM Journal on Optimization, 28, 2, 989-1016 (2018) · Zbl 1396.90068
This reference list is based on information provided by the publisher or from digital mathematics libraries. Its items are heuristically matched to zbMATH identifiers and may contain data conversion errors. It attempts to reflect the references listed in the original paper as accurately as possible without claiming the completeness or perfect precision of the matching.