## Variable selection for support vector machines via smoothing spline ANOVA
*(English)*
Zbl 1096.62072

Summary: It is well known that the support vector machine paradigm is equivalent to solving a regularization problem in a reproducing kernel Hilbert space. The squared-norm penalty in the standard support vector machine controls the smoothness of the classification function. We propose, within the framework of smoothing spline ANOVA models, a new type of regularization that conducts simultaneous classification and variable selection in the SVM. The penalty functional used is the sum of functional component norms, which automatically applies soft-thresholding operations to the functional components and hence yields sparse solutions. We suggest an efficient algorithm that solves the proposed optimization problem by iteratively solving quadratic and linear programming problems. Numerical studies on both simulated data and real datasets show that the modified support vector machine gives very competitive performance compared to other popular classification algorithms, in terms of both classification accuracy and variable selection.
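The core idea of replacing the squared-norm penalty with a sum of norms, so that soft-thresholding zeroes out irrelevant components, can be illustrated in a much simpler setting than the paper's SS-ANOVA formulation: a linear SVM with an L1 penalty on the weight vector. The sketch below is an assumption-laden simplification (it drops the functional-ANOVA decomposition and the iterative QP/LP scheme entirely), but it shows the same mechanism: hinge loss plus a sum of absolute values is expressible as a single linear program, and the L1 penalty drives weights of noise variables to zero. All names (`l1_svm`, `lam`) are illustrative, not from the paper.

```python
import numpy as np
from scipy.optimize import linprog

def l1_svm(X, y, lam=0.5):
    """Fit a linear SVM with an L1 weight penalty via linear programming.

    Minimizes   sum_i xi_i + lam * ||w||_1
    subject to  y_i (w . x_i + b) >= 1 - xi_i,  xi_i >= 0.

    The L1 penalty plays the role of the sum-of-component-norms penalty:
    it soft-thresholds coefficients, so irrelevant variables get weight 0.
    """
    n, d = X.shape
    # Decision vector z = [w (d), b (1), xi (n), u (d)], where u_j >= |w_j|.
    c = np.concatenate([np.zeros(d), [0.0], np.ones(n), lam * np.ones(d)])

    # Margin constraints rewritten as: -y_i x_i.w - y_i b - xi_i <= -1.
    A_margin = np.hstack([-y[:, None] * X, -y[:, None],
                          -np.eye(n), np.zeros((n, d))])
    b_margin = -np.ones(n)

    # Encode |w_j| <= u_j as two linear inequalities per coordinate.
    A_abs_pos = np.hstack([np.eye(d), np.zeros((d, 1)),
                           np.zeros((d, n)), -np.eye(d)])
    A_abs_neg = np.hstack([-np.eye(d), np.zeros((d, 1)),
                           np.zeros((d, n)), -np.eye(d)])

    A_ub = np.vstack([A_margin, A_abs_pos, A_abs_neg])
    b_ub = np.concatenate([b_margin, np.zeros(2 * d)])

    bounds = ([(None, None)] * d      # w free
              + [(None, None)]        # b free
              + [(0, None)] * n       # slacks nonnegative
              + [(0, None)] * d)      # u nonnegative
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    assert res.success, res.message
    return res.x[:d], res.x[d]

# Toy data: 5 variables, but only the first two carry signal.
rng = np.random.default_rng(0)
n, d = 200, 5
X = rng.normal(size=(n, d))
y = np.sign(X[:, 0] - X[:, 1] + 0.1 * rng.normal(size=n))
w, b = l1_svm(X, y, lam=0.5)
```

On this toy problem the fitted weights on the three noise variables are shrunk toward zero while the two signal variables keep substantial weights, which is the variable-selection behavior the component-norm penalty achieves in the functional setting.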

### MSC:

| Code  | Subject |
|-------|---------|
| 62H30 | Classification and discrimination; cluster analysis (statistical aspects) |
| 62J10 | Analysis of variance and covariance (ANOVA) |
| 68T05 | Learning and adaptive systems in artificial intelligence |
| 65C60 | Computational problems in statistics (MSC2010) |
| 90C90 | Applications of mathematical programming |
| 46N30 | Applications of functional analysis in probability theory and statistics |