Global rates of convergence for nonconvex optimization on manifolds. (English) Zbl 07208096
IMA J. Numer. Anal. 39, No. 1, 1-33 (2019); erratum ibid. 40, No. 4, 2940 (2020).
Summary: We consider the minimization of a cost function $$f$$ on a manifold $$\mathcal{M}$$ using Riemannian gradient descent and Riemannian trust regions (RTR). We focus on satisfying necessary optimality conditions within a tolerance $$\varepsilon$$. Specifically, we show that, under Lipschitz-type assumptions on the pullbacks of $$f$$ to the tangent spaces of $$\mathcal{M}$$, both of these algorithms produce points with Riemannian gradient smaller than $$\varepsilon$$ in $$\mathcal{O}\big(1/\varepsilon^2\big)$$ iterations. Furthermore, RTR returns a point where, in addition, the least eigenvalue of the Riemannian Hessian is larger than $$-\varepsilon$$, in $$\mathcal{O}\big(1/\varepsilon^3\big)$$ iterations. There are no assumptions on initialization. The rates match their (sharp) unconstrained counterparts as a function of the accuracy $$\varepsilon$$ (up to constants) and hence are sharp in that sense. These are the first deterministic results on global rates of convergence to approximate first- and second-order Karush-Kuhn-Tucker points on manifolds. They apply, in particular, to optimization constrained to compact submanifolds of $$\mathbb{R}^n$$, under simpler assumptions.
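
To make the setting concrete, the following is a minimal sketch (not taken from the paper) of Riemannian gradient descent on the unit sphere, a compact submanifold of $$\mathbb{R}^n$$, minimizing the Rayleigh quotient $$f(x) = x^\top A x$$. The projection-plus-normalization retraction, the step size based on a crude Lipschitz-type constant, and the stopping rule at gradient norm $$\varepsilon$$ are assumptions made for illustration only.

```python
import numpy as np

def riemannian_gradient_descent(A, x0, step=0.1, eps=1e-6, max_iter=10_000):
    """Illustrative sketch: minimize f(x) = x^T A x over the unit sphere."""
    x = x0 / np.linalg.norm(x0)              # start on the manifold
    for k in range(max_iter):
        egrad = 2.0 * A @ x                  # Euclidean gradient of x^T A x
        rgrad = egrad - (x @ egrad) * x      # project onto the tangent space at x
        if np.linalg.norm(rgrad) <= eps:     # approximate first-order critical point
            return x, k
        y = x - step * rgrad                 # step along the negative Riemannian gradient
        x = y / np.linalg.norm(y)            # retract back onto the sphere
    return x, max_iter

rng = np.random.default_rng(0)
M = rng.standard_normal((50, 50))
A = (M + M.T) / 2                            # symmetric test matrix (assumed example)
L = 2.0 * np.linalg.norm(A, 2)               # crude Lipschitz-type constant for the pullbacks
x, iters = riemannian_gradient_descent(A, rng.standard_normal(50), step=1.0 / L)
print(iters, x @ A @ x, np.linalg.eigvalsh(A)[0])  # f(x) should approach the smallest eigenvalue
```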

##### MSC:
 65-XX Numerical analysis