zbMATH — the first resource for mathematics

Accelerated directional search with non-Euclidean prox-structure. (English. Russian original) Zbl 1434.90143
Autom. Remote Control 80, No. 4, 693-707 (2019); translation from Avtom. Telemekh. 2019, No. 4, 126-143 (2019).
Summary: We consider smooth convex optimization problems for which the full gradient is not available for numerical solution. In [“Random gradient-free minimization of convex functions”, CORE Discussion Paper 2011/1 (2011)] Yu. E. Nesterov proposed accelerated gradient-free methods for solving such problems. Since only unconstrained optimization problems were considered there, Euclidean prox-structures were used. However, if one knows in advance that, say, the solution of the problem is sparse, or, more precisely, that the distances from the starting point to the solution in the 1-norm and in the 2-norm are close, then it is more advantageous to choose a non-Euclidean prox-structure associated with the 1-norm rather than the Euclidean prox-structure associated with the 2-norm. In this work we give a complete justification of this statement. We propose an accelerated descent method along a random direction with a non-Euclidean prox-structure for solving unconstrained optimization problems (in further work, we propose to extend this approach to an accelerated gradient-free method). We obtain convergence-rate estimates for the method and show the difficulties of transferring the above approach to constrained optimization problems.
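To make the setting concrete, the basic (non-accelerated) building block of such methods can be sketched in a few lines: descend along a uniformly random direction, estimating the directional derivative by a finite difference instead of querying the full gradient. This is only a minimal illustrative sketch with a Euclidean step, not the accelerated non-Euclidean method of the paper; the function name and all step-size choices here are hypothetical.

```python
import numpy as np

def random_directional_descent(f, x0, n_iters=2000, step=0.05, h=1e-7, seed=0):
    """Gradient-free descent along random directions (simplified sketch).

    The paper's method additionally uses Nesterov-style acceleration and a
    prox-step tied to the 1-norm; here we keep only the core idea: replace
    the gradient by n * (directional derivative) * direction.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float).copy()
    n = x.size
    for _ in range(n_iters):
        e = rng.standard_normal(n)
        e /= np.linalg.norm(e)            # uniform random direction on the sphere
        d = (f(x + h * e) - f(x)) / h     # finite-difference directional derivative
        x = x - step * n * d * e          # n*d*e acts as a stochastic gradient surrogate
    return x

# Example: minimize the convex quadratic f(x) = ||x||^2 / 2
f = lambda x: 0.5 * float(np.dot(x, x))
x_star = random_directional_descent(f, np.ones(5))
```

Since E[e eᵀ] = I/n for a uniformly random unit direction e, the surrogate n·(∇f(x)·e)·e is an unbiased estimate of ∇f(x) up to the finite-difference error, which is why the plain iteration above converges on smooth convex problems with a suitably small step.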

MSC:
90C25 Convex programming
90C56 Derivative-free methods and methods using generalized derivatives
Software:
ACDS; DiffSharp; GitHub
References:
[1] Nesterov, Yu., Random Gradient-Free Minimization of Convex Functions, CORE Discussion Paper 2011/1, 2011. · Zbl 1380.90220
[2] Nemirovskii, A.S. and Yudin, D.B., Slozhnost’ zadach i effektivnost’ metodov optimizatsii (Complexity of Problems and Efficiency of Optimization Methods), Moscow: Nauka, 1979.
[3] Gasnikov, A.V., Dvurechenskii, P.E., and Nesterov, Yu.E., Stochastic Gradient Methods with Imprecise Oracle, Tr. MFTI, 2016, vol. 8, no. 1, pp. 41-91, arXiv preprint arXiv:1411.4218.
[4] Gasnikov, A.V., Lagunovskaya, A.A., Usmanova, I.N., and Fedorenko, F.A., Gradient-Free Proximal Methods with Inexact Oracle for Convex Stochastic Nonsmooth Optimization Problems on the Simplex, Autom. Remote Control, 2016, vol. 77, no. 11, pp. 2018-2034. · Zbl 1356.93097
[5] Gasnikov, A.V., Krymova, E.A., Lagunovskaya, A.A., Usmanova, I.N., and Fedorenko, F.A., Stochastic Online Optimization. Single-Point and Multi-Point Non-Linear Multi-Armed Bandits. Convex and Strongly-Convex Case, Autom. Remote Control, 2017, vol. 78, no. 2, pp. 224-234. · Zbl 1362.93165
[6] Allen-Zhu, Z. and Orecchia, L., Linear Coupling: An Ultimate Unification of Gradient and Mirror Descent, arXiv preprint. arXiv:1407.1537. · Zbl 1402.90209
[7] Gasnikov, A.V., Sovremennye chislennye metody optimizatsii. Universal’nyi gradientnyi spusk (Modern Numerical Optimization Methods. Universal Gradient Descent), Moscow: MFTI, 2018. https://arxiv.org/ftp/arxiv/papers/1711/1711.00394.pdf
[8] Nesterov, Yu., Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems, SIAM J. Optim., 2012, vol. 22, pp. 341-362.
[9] Dvurechensky, P., Gasnikov, A., and Tiurin, A., Randomized Similar Triangles Method: A Unifying Framework for Accelerated Randomized Optimization Methods (Coordinate Descent, Directional Search, Derivative-Free Method), arXiv preprint, arXiv:1707.08486.
[10] Gorbunov, E.A., Vorontsova, E.A., and Gasnikov, A.V., On the Upper Bound for the Mathematical Expectation of the Norm of a Vector Uniformly Distributed on the Sphere and the Phenomenon of Concentration of Uniform Measure on the Sphere, arXiv preprint. arXiv:1804.03722. · Zbl 1448.60048
[11] Bayandina, A.S., Gasnikov, A.V., and Lagunovskaya, A.A., Gradient-Free Two-Point Methods for Solving Stochastic Nonsmooth Convex Optimization Problems with Small Non-Random Noises, Autom. Remote Control, 2018, vol. 79, no. 8, pp. 1399-1408. · Zbl 1398.93312
[12] Gasnikov, A.V., Dvurechenskii, P.E., and Usmanova, I.N., About Accelerated Randomized Methods, Tr. MFTI, 2016, vol. 8, no. 2, pp. 67-100, arXiv preprint. arXiv:1508.02182.
[13] Allen-Zhu, Z., Katyusha: The First Direct Acceleration of Stochastic Gradient Methods, arXiv preprint, arXiv:1603.05953. · Zbl 06982977
[14] Guzman, C.; Nemirovski, A., On Lower Complexity Bounds for Large-Scale Smooth Convex Optimization (2015) · Zbl 1304.65155
[15] Gasnikov, A.V. and Nesterov, Yu.E., Universal Method for Stochastic Composite Optimization Problems, Zh. Vychisl. Mat. Mat. Fiz., 2018, vol. 58, no. 1, pp. 52-69, arXiv preprint. arXiv:1604.05275. · Zbl 1457.90099
[16] Baydin, A.G., Pearlmutter, B.A., Radul, A.A., and Siskind, J.M., Automatic Differentiation in Machine Learning: A Survey, arXiv preprint. arXiv:1502.05767. · Zbl 06982909
[17] Bregman, L.M., Relaxation Method for Finding a Common Point of Convex Sets and Its Application to Solving Convex Programming Problems, Zh. Vychisl. Mat. Mat. Fiz., 1967, vol. 7, no. 3, pp. 200-217.
[18] Bogolubsky, L., Dvurechensky, P., Gasnikov, A., Gusev, G., Raigorodskii, A., Tikhonov, A., and Zhukovskii, M., Learning Supervised PageRank with Gradient-Based and Gradient-Free Optimization Methods, Proc. 13th Annual Conf. on Neural Information Processing Systems (NIPS), 2016, arXiv preprint. arXiv:1603.00717.
[19] ACDS method Python code. https://github.com/evorontsova/ACDS
[20] Gasnikov, A.V., Effektivnye chislennye metody poiska ravnovesii v bol’shikh transportnykh setyakh (Efficient Numerical Methods for Finding Equilibria in Large Transportation Networks), Doctoral Dissertation, Specialty 05.13.18—Mathematical Modeling, Numerical Methods, Software Systems, Moscow: MFTI, 2016, arXiv preprint. arXiv:1607.03142.
This reference list is based on information provided by the publisher or from digital mathematics libraries. Its items are heuristically matched to zbMATH identifiers and may contain data conversion errors. It attempts to reflect the references listed in the original paper as accurately as possible without claiming the completeness or perfect precision of the matching.