
A robust hybrid of lasso and ridge regression. (English) Zbl 1134.62047
Verducci, Joseph Stephen (ed.) et al., Prediction and discovery. AMS-IMS-SIAM joint summer research conference on machine and statistical learning: prediction and discovery, Snowbird, UT, USA, June 25–29, 2006. Contemporary Mathematics 443 (ISBN 978-0-8218-4195-2/pbk). 59-71 (2007).
Summary: Ridge regression and the lasso are regularized versions of least squares regression using \(L_2\) and \(L_1\) penalties, respectively, on the coefficient vector. To make these regressions more robust, we may replace least squares with P. J. Huber’s criterion [“Robust statistics.” New York etc.: Wiley and Sons (1981; Zbl 0536.62025)], which is a hybrid of squared error (for relatively small errors) and absolute error (for relatively large ones). A reversed version of Huber’s criterion can be used as a hybrid penalty function: relatively small coefficients contribute their \(L_1\) norm to this penalty, while larger ones cause it to grow quadratically. This hybrid sets some coefficients to 0 (as the lasso does) while shrinking the larger coefficients the way ridge regression does. Both the Huber and reversed Huber penalty functions employ a scale parameter. We provide an objective function that is jointly convex in the regression coefficient vector and these two scale parameters.
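The two pieces described above can be sketched as follows. This is an illustrative implementation using the standard Huber loss and its reversed ("berhu") counterpart with a fixed knot parameter \(m\); the paper's full objective, which is jointly convex in the coefficients and the two scale parameters, is not reproduced here.

```python
import numpy as np

def huber(z, m=1.0):
    # Huber's criterion: quadratic for residuals |z| <= m,
    # linear (absolute-error-like) for larger residuals.
    a = np.abs(z)
    return np.where(a <= m, a**2 / 2.0, m * a - m**2 / 2.0)

def berhu(z, m=1.0):
    # Reversed Huber penalty: L1-like for coefficients |z| <= m
    # (encouraging exact zeros, as in the lasso), quadratic for
    # larger coefficients (shrinking them as ridge regression does).
    a = np.abs(z)
    return np.where(a <= m, a, (a**2 + m**2) / (2.0 * m))
```

Both functions are continuous and continuously differentiable at the knot \(|z| = m\), which is what makes the hybrid behave smoothly across the small/large transition.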
For the entire collection see [Zbl 1123.62002].

62J07 Ridge regression; shrinkage estimators (Lasso)
90C90 Applications of mathematical programming
62F35 Robustness and adaptive procedures (parametric inference)
90C25 Convex programming