Semi-analytic resampling in Lasso. (English) Zbl 07064050

Summary: An approximate method is developed for conducting resampling in Lasso, the \(\ell_1\)-penalized linear regression, in a semi-analytic manner: the average over the resampled datasets is computed directly, without repeated numerical sampling. This enables an inference free of the statistical fluctuations caused by the finiteness of the sampling, as well as a significant reduction in computational time. The proposed method is based on a message-passing-type algorithm, and its fast convergence is guaranteed by a state evolution analysis when the covariates are zero-mean independently and identically distributed Gaussian random variables. The method is employed to implement the bootstrapped Lasso (Bolasso) and stability selection, two variable selection methods that combine resampling with Lasso, and it resolves their main disadvantage, the computational cost. To examine approximation accuracy and efficiency, numerical experiments were carried out on simulated datasets, and an application to a real-world dataset, the wine quality dataset, is presented. To process such real-world datasets, an objective criterion for determining the relevance of the selected variables is also introduced, based on the addition of noise variables and resampling. MATLAB code implementing the proposed method is distributed in [T. Obuchi, Matlab package of AMPR. (2018), https://github.com/T-Obuchi/AMPR_lasso_matlab].
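The semi-analytic average requires the paper's message-passing (AMPR) equations; as a point of reference, the numerical-bootstrap baseline that it replaces can be sketched as follows. This is a minimal illustration in plain NumPy with a simple coordinate-descent Lasso solver; the function names and parameter choices are illustrative, not taken from the paper or its MATLAB package.

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """Coordinate descent for (1/2n)||y - Xb||^2 + lam * ||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n  # per-coordinate curvature
    for _ in range(n_iter):
        for j in range(p):
            # partial residual excluding coordinate j
            r = y - X @ b + X[:, j] * b[j]
            rho = X[:, j] @ r / n
            # soft-thresholding update
            b[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
    return b

def selection_probabilities(X, y, lam, n_boot=50, seed=0):
    """Numerical bootstrap: fraction of resampled datasets in which each
    variable enters the Lasso support. This repeated-sampling loop is what
    the semi-analytic (AMPR) approach computes directly."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    counts = np.zeros(p)
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)  # resample rows with replacement
        counts += (lasso_cd(X[idx], y[idx], lam) != 0)
    return counts / n_boot
```

Bolasso then intersects the supports across resamples (equivalently, keeps variables whose selection probability is 1), while stability selection thresholds the probabilities; both inherit the cost of the explicit loop over resampled fits, which is the bottleneck the semi-analytic method removes.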


62G09 Nonparametric statistical resampling methods
62J07 Ridge regression; shrinkage estimators (Lasso)


Matlab; AMPR; UCI-ml; Bolasso
Full Text: arXiv Link


[1] Francis R Bach. Bolasso: model consistent lasso estimation through the bootstrap. In Proceedings of the 25th International Conference on Machine Learning, pages 33-40. ACM, 2008.
[2] Onureena Banerjee, Laurent El Ghaoui, Alexandre d'Aspremont, and Georges Natsoulis. Convex optimization techniques for fitting sparse Gaussian graphical models. In Proceedings of the 23rd International Conference on Machine Learning, pages 89-96. ACM, 2006.
[3] Jean Barbier, Nicolas Macris, Mohamad Dia, and Florent Krzakala. Mutual information and optimality of approximate message-passing in random linear estimation. arXiv preprint arXiv:1701.05823, 2017.
[4] Mohsen Bayati and Andrea Montanari. The dynamics of message passing on dense graphs, with applications to compressed sensing. IEEE Transactions on Information Theory, 57(2):764-785, 2011. · Zbl 1366.94079
[5] Francesco Caltagirone, Lenka Zdeborová, and Florent Krzakala. On convergence of approximate message passing. In 2014 IEEE International Symposium on Information Theory, pages 1812-1816, June 2014. doi: 10.1109/ISIT.2014.6875146.
[6] Burak Çakmak, Ole Winther, and Bernard H Fleury. S-AMP: Approximate message passing for general matrix ensembles. In Information Theory Workshop (ITW), 2014 IEEE, pages 192-196. IEEE, 2014.
[7] Javier Cespedes, Pablo M Olmos, Matilde Sánchez-Fernández, and Fernando Perez-Cruz. Expectation propagation detection for high-order high-dimensional MIMO systems. IEEE Transactions on Communications, 62(8):2840-2849, 2014.
[8] Paulo Cortez, António Cerdeira, Fernando Almeida, Telmo Matos, and José Reis. Modeling wine preferences by data mining from physicochemical properties. Decision Support Systems, 47(4):547-553, 2009.
[9] David L Donoho, Arian Maleki, and Andrea Montanari. Message-passing algorithms for compressed sensing. Proceedings of the National Academy of Sciences, 106(45):18914-18919, 2009.
[10] Viktor Dotsenko. Introduction to the Replica Theory of Disordered Statistical Systems, volume 4. Cambridge University Press, 2005. · Zbl 0999.82001
[11] Bradley Efron and Robert J Tibshirani. An Introduction to the Bootstrap. CRC Press, 1994. · Zbl 0835.62038
[12] Jerome Friedman, Trevor Hastie, and Robert Tibshirani. Sparse inverse covariance estimation with the graphical lasso. Biostatistics, 9(3):432-441, 2008. · Zbl 1143.62076
[13] Jerome Friedman, Trevor Hastie, and Rob Tibshirani. Regularization paths for generalized linear models via coordinate descent. Journal of Statistical Software, 33(1):1, 2010. · Zbl 1109.62302
[14] Max Grazier G'Sell, Jonathan Taylor, and Robert Tibshirani. Adaptive testing for the graphical lasso. arXiv preprint arXiv:1307.4765, 2013.
[15] Edwin Hewitt and Leonard J Savage. Symmetric measures on Cartesian products. Transactions of the American Mathematical Society, 80(2):470-501, 1955. · Zbl 0066.29604
[16] Adel Javanmard and Andrea Montanari. Confidence intervals and hypothesis testing for high-dimensional regression. The Journal of Machine Learning Research, 15(1):2869-2909, 2014a. · Zbl 1319.62145
[17] Adel Javanmard and Andrea Montanari. Hypothesis testing in high-dimensional regression under the Gaussian random design model: Asymptotic theory. IEEE Transactions on Information Theory, 60(10):6522-6554, 2014b. · Zbl 1360.62074
[18] Adel Javanmard and Andrea Montanari. De-biasing the lasso: Optimal sample size for Gaussian designs. arXiv preprint arXiv:1508.02757, 2015. · Zbl 1407.62270
[19] Yoshiyuki Kabashima. A CDMA multiuser detection algorithm on the basis of belief propagation. Journal of Physics A: Mathematical and General, 36(43):11111, 2003. · Zbl 1081.94509
[20] Yoshiyuki Kabashima and Mikko Vehkapera. Signal recovery using expectation consistent approximation for linear observations. In Information Theory (ISIT), 2014 IEEE International Symposium on, pages 226-230. IEEE, 2014.
[21] Keith Knight and Wenjiang Fu. Asymptotics for lasso-type estimators. Annals of Statistics, pages 1356-1378, 2000. · Zbl 1105.62357
[22] M. Lichman. UCI machine learning repository, 2013. URL http://archive.ics.uci.edu/ml.
[23] Richard Lockhart, Jonathan Taylor, Ryan J Tibshirani, and Robert Tibshirani. A significance test for the lasso. Annals of Statistics, 42(2):413, 2014. · Zbl 1305.62254
[24] Junjie Ma and Li Ping. Orthogonal AMP. IEEE Access, 5:2020-2033, 2017.
[25] Dörthe Malzahn and Manfred Opper. An approximate analytical approach to resampling averages. Journal of Machine Learning Research, 4(Dec):1151-1173, 2003. · Zbl 1094.68081
[26] Nicolai Meinshausen and Peter Bühlmann. Consistent neighbourhood selection for sparse high-dimensional graphs with the lasso. Seminar für Statistik, Eidgenössische Technische Hochschule (ETH), Zürich, 2004.
[27] Nicolai Meinshausen and Peter Bühlmann. Stability selection. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 72(4):417-473, 2010. · Zbl 1411.62142
[28] Nicolai Meinshausen and Bin Yu. Lasso-type recovery of sparse representations for high-dimensional data. The Annals of Statistics, pages 246-270, 2009. · Zbl 1155.62050
[29] LE Melkumova and S Ya Shatskikh. Comparing ridge and lasso estimators for data analysis. Procedia Engineering, 201:746-755, 2017.
[30] Marc Mézard, Giorgio Parisi, and Miguel Virasoro. Spin Glass Theory and Beyond: An Introduction to the Replica Method and Its Applications, volume 9. World Scientific Publishing Company, 1987. · Zbl 0992.82500
[31] Balas Kausik Natarajan. Sparse approximate solutions to linear systems. SIAM Journal on Computing, 24(2):227-234, 1995. · Zbl 0827.68054
[32] Hidetoshi Nishimori. Statistical Physics of Spin Glasses and Information Processing: An Introduction, volume 111. Clarendon Press, 2001. · Zbl 1103.82002
[33] Tomoyuki Obuchi. Matlab package of AMPR. https://github.com/T-Obuchi/AMPR_lasso_matlab, 2018.
[34] Manfred Opper and Ole Winther. Adaptive and self-averaging Thouless-Anderson-Palmer mean-field theory for probabilistic modeling. Physical Review E, 64(5):056131, 2001a. · Zbl 1022.68606
[35] Manfred Opper and Ole Winther. Tractable approximations for probabilistic models: The adaptive Thouless-Anderson-Palmer mean field approach. Physical Review Letters, 86(17):3695, 2001b. · Zbl 1022.68606
[36] Manfred Opper and Ole Winther. Expectation consistent approximate inference. Journal of Machine Learning Research, 6(Dec):2177-2204, 2005. · Zbl 1222.68278
[37] Sundeep Rangan, Philip Schniter, and Alyson Fletcher. On the convergence of approximate message passing with arbitrary matrices. In Information Theory (ISIT), 2014 IEEE International Symposium on, pages 236-240. IEEE, 2014. · Zbl 1432.94037
[38] Sundeep Rangan, Philip Schniter, and Alyson Fletcher. Vector approximate message passing. arXiv preprint arXiv:1610.03082, 2016. · Zbl 1432.94036
[39] Hidetoshi Shimodaira et al. Approximately unbiased tests of regions using multistep-multiscale bootstrap resampling. The Annals of Statistics, 32(6):2616-2641, 2004. · Zbl 1078.62045
[40] Takashi Takahashi and Yoshiyuki Kabashima. A statistical mechanics approach to de-biasing and uncertainty estimation in lasso for random measurements. Journal of Statistical Mechanics: Theory and Experiment, 2018(7):073405, 2018.
[41] Keigo Takeuchi. Rigorous dynamics of expectation-propagation-based signal recovery from unitarily invariant measurements. In Information Theory (ISIT), 2017 IEEE International Symposium on, pages 501-505. IEEE, 2017.
[42] Michel Talagrand. Spin Glasses: A Challenge for Mathematicians: Cavity and Mean Field Models, volume 46. Springer Science & Business Media, 2003. · Zbl 1033.82002
[43] Robert Tibshirani. Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society. Series B (Methodological), pages 267-288, 1996. · Zbl 0850.62538
[44] Trevor Hastie, Robert Tibshirani, and Jerome Friedman. The Elements of Statistical Learning: Data Mining, Inference, and Prediction. Springer-Verlag New York, 2009. doi: 10.1007/978-0-387-84858-7. · Zbl 1273.62005
[45] Martin J Wainwright. Sharp thresholds for high-dimensional and noisy sparsity recovery using \(\ell_1\)-constrained quadratic programming (lasso). IEEE Transactions on Information Theory, 55(5):2183-2202, 2009. · Zbl 1367.62220
[46] Ming Yuan and Yi Lin. On the non-negative garrotte estimator. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 69(2):143-161, 2007. · Zbl 1120.62052
[47] Peng Zhao and Bin Yu. On model selection consistency of lasso. Journal of Machine Learning Research, 7(Nov):2541-2563, 2006. · Zbl 1222.62008
[48] Hui Zou.
This reference list is based on information provided by the publisher or from digital mathematics libraries. Its items are heuristically matched to zbMATH identifiers and may contain data conversion errors. It attempts to reflect the references listed in the original paper as accurately as possible without claiming the completeness or perfect precision of the matching.