## Optimizing kernel methods: A unifying variational principle
*(English)*
Zbl 0749.62024

Summary: We consider a variety of optimization problems connected with the choice of a kernel function. An example is the optimization of kernels for estimating characteristic points of a curve, i.e. the locations of extrema of higher-order derivatives. We discuss the problem of finding “optimal” kernels, which minimize the asymptotic mean squared error in this context, and that of “minimum variance” kernels, which minimize the asymptotic variance. The corresponding variational problems are analyzed by means of Jacobi representations, and explicit solutions, which are polynomials with compact support, are obtained. It is then shown that a variety of other variational problems connected with the choice of optimal kernel functions are equivalent to this problem.
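For orientation, the simplest instance of this trade-off uses standard examples that are classical in the literature (they are not reproduced from the paper): when estimating a density itself with a second-order kernel, the kernel minimizing the asymptotic MSE is the Epanechnikov polynomial on \([-1,1]\), while the minimum-variance kernel is the uniform kernel. A minimal numerical sketch checking the mass and roughness constants:

```python
import numpy as np

# Classical second-order kernels with compact support [-1, 1] (standard
# examples, not taken from the paper): the Epanechnikov kernel minimizes
# the asymptotic MSE, the uniform kernel minimizes the asymptotic variance.
def epanechnikov(u):
    return np.where(np.abs(u) <= 1, 0.75 * (1.0 - u**2), 0.0)

def uniform(u):
    return np.where(np.abs(u) <= 1, 0.5, 0.0)

# Riemann-sum quadrature on a fine grid over the support [-1, 1].
u = np.linspace(-1.0, 1.0, 400001)
du = u[1] - u[0]

def mass(K):       # total mass; a kernel should integrate to 1
    return np.sum(K(u)) * du

def roughness(K):  # integral of K^2, the constant driving the variance term
    return np.sum(K(u) ** 2) * du

print(mass(epanechnikov), roughness(epanechnikov))  # ≈ 1.0, 0.6
print(mass(uniform), roughness(uniform))            # ≈ 1.0, 0.5
```

The uniform kernel has the smaller roughness (0.5 vs. 0.6), reflecting its minimum-variance property, while the Epanechnikov kernel achieves the better bias–variance balance.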

A general underlying variational principle is uncovered and investigated. The limiting case as the order of smoothness of the kernel tends to infinity is studied, leading to analytic kernel functions on \(\mathbb{R}\) for which an explicit Hermite representation is found. The kernels thus obtained provide a natural extension of the optimal kernels with compact support.
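For concreteness, a standard example of such an analytic kernel (again not reproduced from the paper) is the fourth-order Gaussian-based kernel, which admits exactly this kind of Hermite-polynomial representation:

```latex
K_4(x) \;=\; \tfrac{1}{2}\,(3 - x^2)\,\varphi(x)
       \;=\; \Bigl(1 - \tfrac{1}{2}\,\mathrm{He}_2(x)\Bigr)\varphi(x),
\qquad \varphi(x) = \tfrac{1}{\sqrt{2\pi}}\,e^{-x^2/2},
```

where \(\mathrm{He}_2(x) = x^2 - 1\) is the probabilists' Hermite polynomial; one checks directly that \(\int K_4 = 1\) and \(\int x^2 K_4(x)\,dx = 0\).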

### MSC:

| MSC   | Classification                                              |
|-------|-------------------------------------------------------------|
| 62G07 | Density estimation                                          |
| 62G20 | Asymptotic properties of nonparametric inference            |
| 49R50 | Variational methods for eigenvalues of operators (MSC2000)  |