# zbMATH — the first resource for mathematics

Optimal reduced-set vectors for support vector machines with a quadratic kernel. (English) Zbl 1089.68113
Summary: To reduce computational cost, the discriminant function of a Support Vector Machine (SVM) should be represented using as few vectors as possible. This problem has been tackled in different ways. In this article, we develop an explicit solution in the case of a general quadratic kernel $$k(x,x')=(C+Dx^\top x')^2$$. For a given number of vectors, this solution provides the best possible approximation and can even recover the discriminant function exactly if the number of vectors used is large enough. The key idea is to express the inhomogeneous kernel as a homogeneous kernel on a space having one dimension more than the original one and then to follow the approach of C. Burges [“Simplified support vector decision rules”, in: L. Saitta (ed.), 13th International Conference on Machine Learning, 71–77 (1996)].
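The dimension-lifting step described above rests on a simple algebraic identity: for $C \ge 0$ and $D > 0$, appending the constant coordinate $\sqrt{C}$ to $\sqrt{D}\,x$ makes the inhomogeneous quadratic kernel a homogeneous one in dimension $d+1$. A minimal numeric sketch (not code from the paper; the map `phi` and the parameter values are illustrative assumptions):

```python
import numpy as np

def phi(x, C=1.0, D=2.0):
    """Embed x in R^(d+1) by prepending sqrt(C) to sqrt(D)*x, so that
    phi(x) @ phi(y) == C + D * (x @ y)."""
    return np.concatenate(([np.sqrt(C)], np.sqrt(D) * x))

rng = np.random.default_rng(0)
x, y = rng.standard_normal(3), rng.standard_normal(3)
C, D = 1.0, 2.0

# Inhomogeneous quadratic kernel in the original d-dimensional space.
k_inhomogeneous = (C + D * (x @ y)) ** 2
# Homogeneous quadratic kernel in the lifted (d+1)-dimensional space.
k_homogeneous = (phi(x, C, D) @ phi(y, C, D)) ** 2

print(np.allclose(k_inhomogeneous, k_homogeneous))  # True
```

Once the kernel is homogeneous, Burges's reduced-set construction for homogeneous quadratic kernels applies directly in the lifted space.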

##### MSC:
 68T05 Learning and adaptive systems in artificial intelligence
##### Keywords:
computational cost
##### References:
 [1] DOI: 10.1162/15324430152748236 · Zbl 0997.68109
This reference list is based on information provided by the publisher or from digital mathematics libraries. Its items are heuristically matched to zbMATH identifiers and may contain data conversion errors. It attempts to reflect the references listed in the original paper as accurately as possible without claiming the completeness or perfect precision of the matching.