an:06081227
Zbl 1257.90073
Nesterov, Yu.
Efficiency of coordinate descent methods on huge-scale optimization problems
EN
SIAM J. Optim. 22, No. 2, 341-362 (2012).
00306584
2012
j
90C25 65K05
coordinate relaxation
For the unconstrained minimization of a differentiable convex function \(f(x_{1},\dots,x_{n})\) (\(x_{i}\in \mathbb{R}^{n_{i}}\), \(i=1,\dots,n\)) with a globally Lipschitz gradient, the random coordinate descent method consists in choosing at random, at iteration \(k\), an index \(i_{k}\in \{1,\dots,n\}\) and then updating the current iterate by a step along the negative partial gradient with respect to \(x_{i_{k}}\), with step size equal to the reciprocal of the Lipschitz constant of this partial gradient. The expected objective function value is shown to converge to the infimum of \(f\); for strongly convex functions, the rate of convergence is linear, and an accelerated version is presented. A modification of the method for constrained problems is also introduced. Implementation issues are discussed, and some preliminary numerical experiments are reported.
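A minimal sketch of the update described above, for the scalar-block case \(n_{i}=1\) with uniform sampling of \(i_{k}\); the function names and the quadratic test problem are illustrative assumptions, not taken from the paper.

import numpy as np

def random_coordinate_descent(grad_i, L, x0, iters=10_000, rng=None):
    # grad_i(x, i): partial derivative of f with respect to coordinate i
    # L[i]: Lipschitz constant of that partial derivative
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float).copy()
    n = x.size
    for _ in range(iters):
        i = rng.integers(n)          # pick i_k uniformly at random
        x[i] -= grad_i(x, i) / L[i]  # step size 1 / L_{i_k}
    return x

# Illustrative problem: f(x) = 0.5 x^T A x - b^T x with A positive definite,
# so the i-th partial derivative is A[i] @ x - b[i] and L_i = A[i, i].
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x = random_coordinate_descent(lambda x, i: A[i] @ x - b[i],
                              L=np.diag(A), x0=np.zeros(2))
print(x, np.linalg.solve(A, b))  # the two vectors should nearly agree

For this quadratic, the step \(1/A_{ii}\) makes each update an exact minimization along coordinate \(i\), so the iterates approach the solution of \(Ax=b\).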
Juan-Enrique Martínez-Legaz (Barcelona)