# zbMATH — the first resource for mathematics

Subgradient methods for sharp weakly convex functions. (English) Zbl 1461.65140
Summary: Subgradient methods converge linearly on a convex function that grows sharply away from its solution set. In this work, we show that the same is true for sharp functions that are only weakly convex, provided that the subgradient methods are initialized within a fixed tube around the solution set. A variety of statistical and signal processing tasks come equipped with good initialization and provably lead to formulations that are both weakly convex and sharp. Therefore, in such settings, subgradient methods can serve as inexpensive local search procedures. We illustrate the proposed techniques on phase retrieval and covariance estimation problems.
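As an illustrative sketch of the kind of method the summary describes, the following applies a Polyak-type subgradient step (using the known minimal value $f^\* = 0$) to a robust phase-retrieval loss, started near the solution set. This is not code from the paper; the problem sizes, step rule, and synthetic data are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic phase retrieval instance: b_i = <a_i, x*>^2 with Gaussian a_i.
d, m = 10, 80
A = rng.standard_normal((m, d))
x_star = rng.standard_normal(d)
x_star /= np.linalg.norm(x_star)
b = (A @ x_star) ** 2

def f(x):
    # Robust phase-retrieval loss: weakly convex and sharp around ±x_star.
    return np.mean(np.abs((A @ x) ** 2 - b))

def subgrad(x):
    # One subgradient of f at x (chain rule through |.| and the square).
    r = (A @ x) ** 2 - b
    return (2.0 / m) * (A.T @ (np.sign(r) * (A @ x)))

# Good initialization: start inside a small tube around the solution set.
x = x_star + 0.1 * rng.standard_normal(d)

for _ in range(1000):
    g = subgrad(x)
    gn2 = g @ g
    if gn2 == 0.0:
        break  # reached a stationary point
    # Polyak step: uses the known optimal value f* = 0.
    x = x - (f(x) / gn2) * g

# Distance to the solution set {x_star, -x_star} (sign ambiguity of phase retrieval).
dist = min(np.linalg.norm(x - x_star), np.linalg.norm(x + x_star))
```

With this well-conditioned Gaussian instance and an initialization close to the signal, the iterates contract toward the solution set at a linear rate, which is the behavior the summary attributes to sharp weakly convex problems; far from the tube, no such guarantee holds.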

##### MSC:
- 65K05 Numerical mathematical programming methods
- 90C25 Convex programming
- 90C56 Derivative-free methods and methods using generalized derivatives
##### Keywords:
subgradient; weakly convex; sharp; error bound; linear rate
##### Software:
Wirtinger Flow