Adaptive simplification of solution for support vector machine.

*(English)* Zbl 1119.68151

Summary: The support vector machine (SVM) has been receiving increasing interest in areas ranging from its original application in pattern recognition to others such as regression estimation, owing to its remarkable generalization performance. Unfortunately, the SVM is considerably slow in the test phase because of the number of support vectors, which has been a serious limitation for some applications. To overcome this problem, we propose an adaptive algorithm named feature vector selection (FVS), based on the vector correlation principle and a greedy strategy, to select feature vectors from the support vector solution. The adaptive algorithm improves the sparsity of the solution and reduces the time cost of testing. Since the number of feature vectors can be chosen adaptively according to the requirements, the trade-off between generalization and complexity can be controlled directly. Computer simulations on regression estimation and pattern recognition show that FVS is a promising algorithm for simplifying the solution of the support vector machine.
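The summary describes FVS only at a high level. A minimal sketch of a greedy, kernel-space feature vector selection in this spirit (following the vector-correlation fitness of Baudat and Anouar, ref. [16] of the paper; the function names and the regularization constants here are assumptions, not the authors' implementation) might look like:

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Pairwise RBF (Gaussian) kernel matrix between rows of X and rows of Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def greedy_fvs(K, n_select):
    """Greedy feature vector selection on a precomputed kernel matrix K.

    At each step, add the candidate j that maximizes the mean kernel-space
    reconstruction fitness of all vectors by the selected set S:
        J_S(x_i) = k_Si^T K_SS^{-1} k_Si / K_ii,
    i.e. how well phi(x_i) is approximated by span{phi(x_s) : s in S}.
    """
    n = K.shape[0]
    diag = np.diag(K).copy()
    diag[diag == 0] = 1e-12          # guard against zero self-similarity
    selected, remaining = [], list(range(n))
    for _ in range(n_select):
        best_j, best_score = None, -np.inf
        for j in remaining:
            S = selected + [j]
            K_SS = K[np.ix_(S, S)]
            K_S = K[np.ix_(S, range(n))]
            # sol = K_SS^{-1} K_S; small ridge term keeps the solve stable.
            sol = np.linalg.solve(K_SS + 1e-10 * np.eye(len(S)), K_S)
            fitness = (K_S * sol).sum(axis=0) / diag
            score = fitness.mean()
            if score > best_score:
                best_j, best_score = j, score
        selected.append(best_j)
        remaining.remove(best_j)
    return selected
```

The selected indices would then index the support vectors whose kernel expansions approximate the full SVM decision function, so the test-phase cost scales with `n_select` rather than with the full number of support vectors. The stopping rule here (a fixed `n_select`) stands in for the adaptive criterion described in the paper.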

##### MSC:

68T05 | Learning and adaptive systems in artificial intelligence |

68T10 | Pattern recognition, speech recognition |

##### Keywords:

support vector machine; simplification; vector correlation; feature vector; regression estimation; pattern recognition

##### Software:

UCI-ml

\textit{Q. Li} et al., Pattern Recognition 40, No. 3, 972--980 (2007; Zbl 1119.68151)

