*(English)* Zbl 1139.62055

Summary: In high-dimensional data analysis, sliced inverse regression (SIR) has proven to be an effective dimension reduction tool and enjoys wide application. The usual SIR, however, cannot handle problems where the number of predictors, $p$, exceeds the sample size, $n$, and can suffer when there is high collinearity among the predictors. Moreover, the reduced space consists of linear combinations of all the original predictors, so no variable selection is achieved.

We propose a regularized SIR approach based on the least-squares formulation of SIR. An $L_2$ regularization is introduced, and an alternating least-squares algorithm is developed, to enable SIR to work with $n<p$ and highly correlated predictors. An $L_1$ regularization is further introduced to achieve simultaneous estimation of the reduced space and selection of predictors. Both simulations and the analysis of a microarray expression data set demonstrate the usefulness of the proposed method.
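The least-squares view of SIR with an $L_2$ penalty can be sketched as follows: slice the response, collect the weighted, centred slice means into a matrix $F$, and alternate between solving for the slice coefficients and a ridge-penalized direction. This is a minimal one-direction ($d=1$) illustration, not the paper's implementation; the function name, arguments, and simulated data are all hypothetical.

```python
import numpy as np

def regularized_sir(X, y, n_slices=10, tau=0.1, n_iter=50):
    """Sketch of L2-regularized SIR (one direction) fit by
    alternating least squares; names are illustrative only."""
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    Sigma = Xc.T @ Xc / n                       # sample covariance

    # Columns f_h = sqrt(p_h) * (slice mean of X - overall mean).
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)
    F = np.column_stack([
        np.sqrt(len(idx) / n) * Xc[idx].mean(axis=0) for idx in slices
    ])                                          # p x H

    # Alternate on ||F - Sigma b c||_F^2 + tau ||b||^2.
    b = np.ones(p) / np.sqrt(p)
    for _ in range(n_iter):
        s = Sigma @ b
        c = (s @ F) / (s @ s)                   # c-step: plain LS
        # b-step: ridge regression, well-posed even when n < p.
        A = (c @ c) * (Sigma.T @ Sigma) + tau * np.eye(p)
        b = np.linalg.solve(A, Sigma.T @ F @ c)
        b /= np.linalg.norm(b)
    return b

# Hypothetical usage on simulated data with true direction beta.
rng = np.random.default_rng(0)
n, p = 400, 8
beta = np.zeros(p)
beta[:2] = 1.0
X = rng.normal(size=(n, p))
y = X @ beta + 0.5 * rng.normal(size=n)
b_hat = regularized_sir(X, y)
```

Replacing the ridge term in the b-step with an $L_1$ penalty (e.g. a lasso solve) would shrink individual coordinates of $b$ to exactly zero, which is how the simultaneous predictor selection described above is obtained.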