On the convergence of block coordinate descent type methods. (English) Zbl 1297.90113
The authors present the so-called BCGD (block coordinate gradient descent) method for unconstrained smooth convex optimization problems \[ \min\{f(x): x\in\mathbb{R}^n\}, \] where the decision vector \(x\in\mathbb{R}^n\) and the gradient \(\nabla f(x)\) of the objective function are split into suitable blocks of coordinates. Assuming block-coordinatewise Lipschitz continuity of \(\nabla f(x)\), in each iteration of the BCGD method one performs gradient steps with constant stepsizes (depending on the block Lipschitz constants) with respect to the different blocks of coordinates, taken in a cyclic order.
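To make the cyclic update scheme concrete, the following is a minimal sketch in Python/NumPy (not taken from the paper); the two-block partition, the quadratic test function and the way the block Lipschitz constants are computed are illustrative assumptions.

```python
import numpy as np

def cyclic_bcgd(grad, blocks, L, x0, num_cycles=200):
    """Sketch of a cyclic block coordinate gradient descent scheme.

    grad   : callable returning the full gradient of f at x
    blocks : list of index arrays partitioning {0, ..., n-1} (assumed partition)
    L      : block-coordinatewise Lipschitz constants, one per block
    x0     : starting point
    """
    x = x0.copy()
    for _ in range(num_cycles):
        # one cycle: update every block once, in a fixed cyclic order
        for idx, L_i in zip(blocks, L):
            g = grad(x)                 # gradient at the current (partially updated) point
            x[idx] -= g[idx] / L_i      # constant stepsize 1/L_i on the i-th block only
    return x

# Illustrative strongly convex quadratic f(x) = 0.5 x^T A x - b^T x (hypothetical test problem)
rng = np.random.default_rng(0)
M = rng.standard_normal((6, 6))
A = M.T @ M + np.eye(6)                 # symmetric positive definite
b = rng.standard_normal(6)
grad = lambda x: A @ x - b

blocks = [np.arange(0, 3), np.arange(3, 6)]                  # two coordinate blocks
L = [np.linalg.norm(A[np.ix_(i, i)], 2) for i in blocks]     # spectral norms of the diagonal blocks

x_hat = cyclic_bcgd(grad, blocks, L, np.zeros(6))
print(np.linalg.norm(x_hat - np.linalg.solve(A, b)))         # distance to the minimizer
```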
It is shown that this method has a global sublinear rate of convergence. When, in addition, the function \(f\) is strongly convex, even a linear rate of convergence can be proved. In particular, it is pointed out that this deterministic BCGD method can even exhibit better empirical performance than the randomized approach of Nesterov.
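Schematically, with generic constants \(C>0\), \(c>0\) and \(q\in(0,1)\) standing in for the explicit quantities derived in the paper, these two statements take the form \[ f(x_k)-f^*\le \frac{C}{k}\quad\text{for all } k\ge 1, \qquad\text{respectively}\qquad f(x_k)-f^*\le c\,q^k, \] in the convex and in the strongly convex case, where \(x_k\) denotes the iterate after \(k\) cycles and \(f^*\) the optimal value.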
In the second part of the paper the authors give some remarks regarding the acceleration of the BCGD method and its extension to constrained optimization problems.

MSC:
90C25 Convex programming
90C06 Large-scale problems in mathematical programming
65K05 Numerical mathematical programming methods