## Statistical learning with sparsity. The Lasso and generalizations.
*(English)*
Zbl 1319.68003

Monographs on Statistics and Applied Probability 143. Boca Raton, FL: CRC Press (ISBN 978-1-4987-1216-3/hbk; 978-1-4987-1217-0/ebook). xv, 351 p. (2015).

This monograph deals with statistical learning with sparsity. The authors study and analyze methods that use the sparsity property of certain statistical models to recover the underlying signal in a dataset. They focus on the Lasso technique [R. Tibshirani, J. R. Stat. Soc., Ser. B 58, No. 1, 267–288 (1996; Zbl 0850.62538)] as an alternative to the standard least-squares method. The Lasso estimator is applied to linear models and generalized linear models, and several generalizations of the Lasso penalty are then considered. The particular relevance of the Lasso within the framework of optimization methods is underlined. In addition, statistical inference, matrix decomposition, sparse multivariate methods, and graph selection are reviewed. Sparsity in signal representation and approximation is also treated, together with the use of \(\ell_1\)-methods to exploit sparsity in various signal-processing problems. The book ends with theoretical results on the behavior of the Lasso, and is intended for those working with big data who want to use the sparsity assumption to extract useful information from large datasets.
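To fix ideas, the Lasso estimator discussed here solves \(\min_\beta \frac{1}{2n}\|y - X\beta\|_2^2 + \lambda\|\beta\|_1\), whose \(\ell_1\) penalty sets some coefficients exactly to zero. The following is a minimal illustrative sketch (not code from the book) solving this problem by cyclic coordinate descent with soft-thresholding; the penalty level `lam` and the toy data in the usage note are arbitrary choices for demonstration.

```python
import numpy as np

def soft_threshold(z, t):
    # Soft-thresholding operator: S(z, t) = sign(z) * max(|z| - t, 0)
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    """Minimize (1/(2n)) * ||y - X b||^2 + lam * ||b||_1
    by cyclic coordinate descent (a standard algorithm for the Lasso)."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n  # per-coordinate curvature terms
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual: remove every feature's contribution except j's
            r = y - X @ b + X[:, j] * b[j]
            rho = X[:, j] @ r / n
            # Coordinate update: soft-threshold, then rescale
            b[j] = soft_threshold(rho, lam) / col_sq[j]
    return b
```

On a toy problem where only a few true coefficients are nonzero, the soft-thresholding step zeroes out the irrelevant coordinates exactly, which is the sparsity-recovery behavior the book analyzes (unlike least squares, which returns small but nonzero estimates for every coefficient).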

Reviewer: Florin Gorunescu (Craiova)

### MSC:

| MSC code | Description |
| --- | --- |
| 68-02 | Research exposition (monographs, survey articles) pertaining to computer science |
| 62-02 | Research exposition (monographs, survey articles) pertaining to statistics |
| 62H12 | Estimation in multivariate analysis |
| 62J05 | Linear regression; mixed models |
| 62J07 | Ridge regression; shrinkage estimators (Lasso) |
| 68T05 | Learning and adaptive systems in artificial intelligence |