swMATH ID: 25237
Software Authors: Chen, Tianyi; Curtis, Frank E.; Robinson, Daniel P.
Description: FaRSA for ℓ1-regularized convex optimization: local convergence and numerical experience. FaRSA is a new method for minimizing the sum of a differentiable convex function and an ℓ1-norm regularizer. The main features of the method include: (i) an evolving set of indices corresponding to variables that are predicted to be nonzero at a solution; (ii) subproblems that only need to be solved in a reduced space to lessen per-iteration computational costs; (iii) conditions that determine how accurately each subproblem must be solved, which allow conjugate gradient or coordinate descent techniques to be employed; (iv) a computationally practical condition that dictates when the subspace explored by the current subproblem should be updated; and (v) a reduced proximal gradient step that ensures a sufficient decrease in the objective function when it is decided that the index set that holds the nonzero variables should be expanded. We prove global convergence of the method and demonstrate its performance on a set of model prediction problems with a Matlab implementation. Here, we introduce an enhanced subproblem termination condition that allows us to prove that the iterates converge locally at a superlinear rate. Moreover, we present the details of our publicly available C implementation along with extensive numerical comparisons to other state-of-the-art solvers.
Homepage: https://www.tandfonline.com/doi/abs/10.1080/10556788.2017.1415336
Keywords: nonlinear optimization; convex optimization; sparse optimization; active-set methods; reduced-space methods; subspace minimization; model prediction
Related Software: LIBSVM; Matlab
Cited in: 1 Publication
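The problem class FaRSA targets, minimizing f(x) + λ‖x‖₁ with f smooth and convex, is commonly illustrated by the basic proximal gradient (soft-thresholding) iteration. The sketch below shows that baseline iteration on a least-squares instance with synthetic data; it is not the FaRSA implementation, which refines this idea with reduced-space subproblems over a predicted nonzero index set. All names and data here are illustrative.

```python
# Illustrative sketch (NOT the FaRSA implementation): basic proximal gradient
# iteration for min_x f(x) + lam * ||x||_1 with f smooth and convex.
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of t*||.||_1: component-wise soft thresholding."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def prox_grad(grad_f, x0, lam, step, iters=500):
    """Full-space proximal gradient descent. FaRSA instead solves smooth
    subproblems restricted to the indices predicted to be nonzero, and
    uses a reduced proximal gradient step when that index set expands."""
    x = x0.copy()
    for _ in range(iters):
        x = soft_threshold(x - step * grad_f(x), step * lam)
    return x

# Hypothetical instance: f(x) = 0.5 * ||A x - b||^2.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 10))
b = rng.standard_normal(20)
grad_f = lambda x: A.T @ (A @ x - b)
step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1/L for the Lipschitz constant of grad_f
x_star = prox_grad(grad_f, np.zeros(10), lam=1.0, step=step)
```

With a moderate λ, many components of the returned iterate are exactly zero, which is the structure FaRSA's evolving index set is designed to exploit.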