
zbMATH — the first resource for mathematics

An optimal subgradient algorithm for large-scale bound-constrained convex optimization. (English) Zbl 1380.90215
Summary: This paper shows that the optimal subgradient algorithm (OSGA), which uses only first-order information to solve convex optimization problems with optimal complexity, can be used to solve arbitrary bound-constrained convex optimization problems efficiently. This is done by constructing both an explicit method and an inexact scheme for solving the bound-constrained rational subproblem required by OSGA, which leads to an efficient implementation of OSGA on large-scale problems in applications arising from signal and image processing, machine learning, and statistics. Numerical experiments demonstrate the promising performance of OSGA on such problems. A software package implementing OSGA for bound-constrained convex problems is available.
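The rational subproblem mentioned in the summary asks, at each OSGA iteration, for a maximizer of a ratio of the form e(z) = -(γ + ⟨h, z⟩)/Q(z) over the feasible set, where Q is a quadratic prox function (notation as in Neumaier's OSGA paper, reference [35]). The sketch below is only a generic projected-gradient stand-in for this subproblem on a box, not the explicit or inexact scheme constructed in the paper; the instance data at the bottom are made up for illustration.

```python
import numpy as np

def osga_box_subproblem(gamma, h, z0, Q0, lo, hi, steps=500, lr=0.1):
    """Approximately maximize e(z) = -(gamma + h @ z) / Q(z) over the box
    [lo, hi], with prox function Q(z) = Q0 + 0.5 * ||z - z0||^2 (Q0 > 0).
    This uses plain projected gradient ascent as a stand-in; the paper
    derives much more efficient explicit/inexact schemes for this step."""
    z = np.clip(z0.astype(float).copy(), lo, hi)
    for _ in range(steps):
        d = z - z0
        Q = Q0 + 0.5 * (d @ d)
        num = -(gamma + h @ z)
        # quotient rule: grad e(z) = (-h * Q - num * (z - z0)) / Q^2
        grad = (-h * Q - num * d) / Q**2
        z = np.clip(z + lr * grad, lo, hi)   # ascent step, then box projection
    d = z - z0
    return z, -(gamma + h @ z) / (Q0 + 0.5 * (d @ d))

# Toy instance (all values illustrative, not from the paper)
gamma, h, z0 = -1.0, np.array([1.0, -1.0, 0.5]), np.zeros(3)
z_opt, e_opt = osga_box_subproblem(gamma, h, z0, 1.0, -np.ones(3), np.ones(3))
```

Since Q(z) ≥ Q0 > 0 on the box, e is smooth there and a small fixed step size suffices for this toy demonstration; the paper's point is precisely that for box constraints this subproblem can be solved far more cheaply than by such an iterative scheme.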

MSC:
90C25 Convex programming
90C60 Abstract computational complexity for mathematical programming problems
49M37 Numerical methods based on nonlinear programming
65K05 Numerical mathematical programming methods
68Q25 Analysis of algorithms and problem complexity
Software:
LBFGS-B; NNLS; OSGA; TRON
Full Text: DOI arXiv
References:
[1] Ahookhosh M (2014) User’s manual for OSGA (optimal subgradient algorithm). http://homepage.univie.ac.at/masoud.ahookhosh/uploads/User’s_manual_for_OSGA.pdf
[2] Ahookhosh M (2016) Optimal subgradient algorithms with application to large-scale linear inverse problems. Submitted. arXiv:1402.7291
[3] Ahookhosh M, Neumaier A (2013) High-dimensional convex optimization via optimal affine subgradient algorithms. In: ROKS workshop, pp 83-84
[4] Ahookhosh M, Neumaier A (2016) Solving structured nonsmooth convex optimization with complexity \(\mathcal{O}(\varepsilon^{-1/2})\) (submitted)
[5] Ahookhosh M, Neumaier A (2016) An optimal subgradient algorithm with subspace search for costly convex optimization problems (submitted) · Zbl 1412.90105
[6] Ahookhosh M, Neumaier A (2017) Optimal subgradient algorithms for large-scale convex optimization in simple domains. Numerical Algorithms · Zbl 1411.90261
[7] Bardsley, J; Vogel, CR, A nonnegatively constrained convex programming method for image reconstruction, SIAM J Sci Comput, 25, 1326-1343, (2003) · Zbl 1061.65047
[8] Beck, A; Teboulle, M, Fast gradient-based algorithms for constrained total variation image denoising and deblurring, IEEE Trans Image Process, 18, 2419-2434, (2009) · Zbl 1371.94049
[9] Birgin, EG; Martínez, JM; Raydan, M, Nonmonotone spectral projected gradient methods on convex sets, SIAM J Optim, 10, 1196-1211, (2000) · Zbl 1047.90077
[10] Boţ, RI; Hendrich, C, A Douglas-Rachford type primal-dual method for solving inclusions with mixtures of composite and parallel-sum type monotone operators, SIAM J Optim, 23, 2541-2565, (2013) · Zbl 1295.47066
[11] Boţ, RI; Csetnek, ER; Hendrich, C, A primal-dual splitting algorithm for finding zeros of sums of maximally monotone operators, SIAM J Optim, 23, 2011-2036, (2013) · Zbl 1314.47102
[12] Boyd S, Xiao L, Mutapcic A (2003) Subgradient methods. http://www.stanford.edu/class/ee392o/subgrad_method.pdf
[13] Branch, MA; Coleman, TF; Li, Y, A subspace, interior, and conjugate gradient method for large-scale bound-constrained minimization problems, SIAM J Sci Comput, 21, 1-23, (1999) · Zbl 0940.65065
[14] Byrd, RH; Lu, P; Nocedal, J; Zhu, C, A limited memory algorithm for bound constrained optimization, SIAM J Sci Comput, 16, 1190-1208, (1995) · Zbl 0836.65080
[15] Chambolle A, Caselles V, Cremers D, Novaga M, Pock T (2010) An introduction to total variation for image analysis, In: Theoretical foundations and numerical methods for sparse recovery, vol. 9, De Gruyter, Radon Series Comp. Appl. Math, pp 263-340 · Zbl 1209.94004
[16] Chan, RH; Tao, M; Yuan, X, Constrained total variation deblurring models and fast algorithms based on alternating direction method of multipliers, SIAM J Imaging Sci, 6, 680-697, (2013) · Zbl 1279.68322
[17] Dai, YH; Fletcher, R, New algorithms for singly linearly constrained quadratic programs subject to lower and upper bounds, Math Program, 106, 403-421, (2006) · Zbl 1134.90030
[18] Elfving, T; Hansen, PC; Nikazad, T, Semiconvergence and relaxation parameters for projected SIRT algorithms, SIAM J Sci Comput, 34, 2000-2017, (2012) · Zbl 1254.65044
[19] Esser, E; Lou, Y; Xin, J, A method for finding structured sparse solutions to nonnegative least squares problems with applications, SIAM J Imaging Sci, 6, 2010-2046, (2013) · Zbl 1282.90239
[20] Karmitsa, N; Mäkelä, MM, Limited memory bundle method for large bound constrained nonsmooth optimization: convergence analysis, Optim Methods Softw, 25, 895-916, (2010) · Zbl 1198.90359
[21] Karmitsa, N; Mäkelä, MM, Adaptive limited memory bundle method for bound constrained large-scale nonsmooth optimization, Optimization, 59, 945-962, (2010) · Zbl 1195.90073
[22] Kaufman, L; Neumaier, A, PET regularization by envelope guided conjugate gradients, IEEE Trans Med Imaging, 15, 385-389, (1996)
[23] Kaufman, L; Neumaier, A, Regularization of ill-posed problems by envelope guided conjugate gradients, J Comput Graph Stat, 6, 451-463, (1997)
[24] Kim, D; Sra, S; Dhillon, IS, Tackling box-constrained optimization via a new projected quasi-Newton approach, SIAM J Sci Comput, 32, 3548-3563, (2010) · Zbl 1220.93085
[25] Hager, WW; Zhang, H, A new active set algorithm for box constrained optimization, SIAM J Optim, 17, 526-557, (2006) · Zbl 1165.90570
[26] Hager, WW; Zhang, H, The limited memory conjugate gradient method, SIAM J Optim, 23, 2150-2168, (2013) · Zbl 1298.90129
[27] Helgason, R; Kennington, J; Lall, H, A polynomially bounded algorithm for a singly constrained quadratic program, Math Program, 18, 338-343, (1980) · Zbl 0452.90054
[28] Lin, CJ; Moré, JJ, Newton’s method for large bound-constrained optimization problems, SIAM J Optim, 9, 1100-1127, (1999) · Zbl 0957.65064
[29] Morini, S; Porcelli, M; Chan, RH, A reduced Newton method for constrained linear least squares problems, J Comput Appl Math, 233, 2200-2212, (2010) · Zbl 1186.65049
[30] Nemirovsky AS, Yudin DB (1983) Problem complexity and method efficiency in optimization. Wiley, New York
[31] Nesterov Y (2004) Introductory lectures on convex optimization: a basic course. Kluwer, Dordrecht · Zbl 1086.90045
[32] Nesterov, Y, Smooth minimization of non-smooth functions, Math Program, 103, 127-152, (2005) · Zbl 1079.90102
[33] Neumaier, A, Solving ill-conditioned and singular linear systems: a tutorial on regularization, SIAM Rev, 40, 636-666, (1998) · Zbl 0913.65032
[34] Neumaier A (2001) Introduction to numerical analysis. Cambridge University Press, Cambridge · Zbl 0980.65001
[35] Neumaier, A, OSGA: a fast subgradient algorithm with optimal complexity, Math Program, 158, 1-21, (2016) · Zbl 1346.90671
[36] Neumaier A, Azmi B (2016) LMBOPT: a limited memory method for bound-constrained optimization, manuscript, University of Vienna
[37] Pardalos, PM; Kovoor, N, An algorithm for a singly constrained class of quadratic programs subject to upper and lower bounds, Math Program, 46, 321-328, (1990) · Zbl 0711.90061
[38] Woo, H; Yun, S, Proximal linearized alternating direction method for multiplicative denoising, SIAM J Sci Comput, 35, 336-358, (2013) · Zbl 1326.94025
[39] Wright, SJ; Nowak, RD; Figueiredo, MAT, Sparse reconstruction by separable approximation, IEEE Trans Signal Process, 57, 2479-2493, (2009) · Zbl 1391.94442
[40] Zhang, J; Morini, B, Solving regularized linear least-squares problems by the alternating direction method with applications to image restoration, Electron Trans Numer Anal, 40, 356-372, (2013) · Zbl 1288.65090
[41] Zhang X, Saha A, Vishwanathan SVN (2011) Lower bounds on rate of convergence of cutting plane methods. In: Advances in neural information processing systems 23
This reference list is based on information provided by the publisher or from digital mathematics libraries. Its items are heuristically matched to zbMATH identifiers and may contain data conversion errors. It attempts to reflect the references listed in the original paper as accurately as possible without claiming the completeness or perfect precision of the matching.