
zbMATH — the first resource for mathematics

Massive data classification via unconstrained support vector machines. (English) Zbl 1139.90017
Summary: A highly accurate algorithm, based on support vector machines formulated as linear programs [O. L. Mangasarian, in: Advances in large margin classifiers, 135–146, MIT Press, Cambridge, MA (2000); see Zbl 0988.68145; P. S. Bradley and O. L. Mangasarian, in Proceedings of the Fifteenth International Conference on Machine Learning, 82–90, Morgan Kaufmann, San Francisco, CA (1998)], is proposed here as a completely unconstrained minimization problem [O. L. Mangasarian, J. Mach. Learn. Res. 7, 1517–1530 (2006)]. Combined with a chunking procedure [P. S. Bradley and O. L. Mangasarian, Optim. Methods Softw. 13, No. 1, 1–10 (2000; Zbl 0986.90085)], this approach, which requires nothing more complex than a linear equation solver, leads to a simple and accurate method for classifying million-point datasets. Because a 1-norm support vector machine underlies the proposed approach, the method suppresses input-space features as well. A state-of-the-art linear programming package (CPLEX, see http://www.ilog.com/products/cplex/) fails to solve problems handled by the proposed algorithm.
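The paper's actual method applies a generalized-Newton scheme to an exact unconstrained reformulation of the 1-norm LP-SVM; the sketch below is only a simplified illustration of the two ingredients the summary names, an unconstrained smooth objective (here a squared-hinge loss with 2-norm regularization, substituted for the exact 1-norm formulation) and block-wise "chunking" of the data, minimized by plain gradient descent. All names, constants, and the toy dataset are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
# toy 2-class problem: rows of A are points, y holds +1/-1 labels
n, d, chunk = 1000, 2, 200
A = rng.normal(size=(n, d))
y = np.where(A[:, 0] + A[:, 1] > 0.0, 1.0, -1.0)

# unconstrained surrogate objective (NOT the paper's exact 1-norm LP-SVM):
#   f(w, b) = (1/n) * sum_i max(0, 1 - y_i (a_i . w - b))^2 + (lam/2) ||w||^2
w, b = np.zeros(d), 0.0
lam, lr = 1e-3, 0.1
for _ in range(500):
    # "chunking": accumulate the gradient block by block, so only one
    # block of rows is ever needed in memory at a time
    gw, gb = lam * w, 0.0
    for s in range(0, n, chunk):
        Ac, yc = A[s:s + chunk], y[s:s + chunk]
        m = np.maximum(0.0, 1.0 - yc * (Ac @ w - b))  # margin violations
        gw += (-2.0 / n) * (Ac.T @ (yc * m))
        gb += (2.0 / n) * np.sum(yc * m)
    w -= lr * gw
    b -= lr * gb

acc = float(np.mean(np.sign(A @ w - b) == y))
```

Since the toy labels are exactly linearly separable, the fitted plane recovers them almost perfectly; on real million-point data the same block-wise gradient accumulation keeps memory use proportional to the chunk size rather than the dataset size, which is the point of the chunking procedure cited in the summary.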

MSC:
90C08 Special problems of linear programming (transportation, multi-index, data envelopment analysis, etc.)
62H30 Classification and discrimination; cluster analysis (statistical aspects)
68T05 Learning and adaptive systems in artificial intelligence
Software:
CPLEX; Matlab; MINOS
References:
[1] Mangasarian, O. L., Generalized Support Vector Machines, Advances in Large Margin Classifiers, Edited by A. Smola, P. Bartlett, B. Schölkopf, and D. Schuurmans, MIT Press, Cambridge, Massachusetts, pp. 135–146, 2000; see website ftp://ftp.cs.wisc.edu/math-prog/tech-reports/98-14.ps.
[2] Bradley, P. S., and Mangasarian, O. L., Feature Selection via Concave Minimization and Support Vector Machines, Machine Learning. Proceedings of the 15th International Conference (ICML ’98), Edited by J. Shavlik, Morgan Kaufmann, San Francisco, California, pp. 82–90, 1998; see website ftp://ftp.cs.wisc.edu/math-prog/tech-reports/98-03.ps .
[3] Mangasarian, O. L., Exact 1-Norm Support Vector Machines via Unconstrained Convex Differentiable Minimization, Report 05-03, Data Mining Institute, Computer Sciences Department, University of Wisconsin, Madison, Wisconsin, 2005; see website ftp://ftp.cs.wisc.edu/pub/dmi/tech-reports/05-03.ps. Journal of Machine Learning Research, Vol. 7, pp. 1517–1530, 2006. · Zbl 1211.68329
[4] Bradley, P. S., and Mangasarian, O. L., Massive Data Discrimination via Linear Support Vector Machines, Optimization Methods and Software, Vol. 13, pp. 1–10, 2000; see website ftp://ftp.cs.wisc.edu/pub/math-prog/tech-reports/98-03.ps . · Zbl 0986.90085 · doi:10.1080/10556780008805771
[5] ILOG, Incline Village, Nevada, ILOG CPLEX 9.0 User’s Manual, 2003; see website http://www.ilog.com/products/cplex/ .
[6] Cristianini, N., and Shawe-Taylor, J., An Introduction to Support Vector Machines, Cambridge University Press, Cambridge, UK, 2000. · Zbl 0994.68074
[7] Vapnik, V. N., The Nature of Statistical Learning Theory, 2nd Edition, Springer, New York, NY, 2000. · Zbl 0934.62009
[8] Schölkopf, B., and Smola, A., Learning with Kernels, MIT Press, Cambridge, Massachusetts, 2002. · Zbl 1019.68094
[9] Fung, G., and Mangasarian, O. L., A Feature Selection Newton Method for Support Vector Machine Classification, Computational Optimization and Applications, Vol. 28, pp. 185–202, 2004; see website ftp://ftp.cs.wisc.edu/pub/dmi/tech-reports/02-03.ps. · Zbl 1056.90103 · doi:10.1023/B:COAP.0000026884.66338.df
[10] Hiriart-Urruty, J. B., Strodiot, J. J., and Nguyen, V. H., Generalized Hessian Matrix and Second-Order Optimality Conditions for Problems with C^{1,1} Data, Applied Mathematics and Optimization, Vol. 11, pp. 43–56, 1984. · Zbl 0542.49011 · doi:10.1007/BF01442169
[11] Facchinei, F., Minimization of SC^1 Functions and the Maratos Effect, Operations Research Letters, Vol. 17, pp. 131–137, 1995. · Zbl 0843.90108 · doi:10.1016/0167-6377(94)00059-F
[12] Mangasarian, O. L., A Finite Newton Method for Classification Problems, Report 01–11, Data Mining Institute, Computer Sciences Department, University of Wisconsin, Madison, Wisconsin, 2001; see website ftp://ftp.cs.wisc.edu/pub/dmi/tech-reports/01-11.ps. See paper in Optimization Methods and Software, Vol. 17, pp. 913–929, 2002.
[13] Mangasarian, O. L., Arbitrary-Norm Separating Plane, Operations Research Letters, Vol. 24, pp. 15–23, 1999; see website ftp://ftp.cs.wisc.edu/math-prog/tech-reports/97-07r.ps. · Zbl 1028.90037 · doi:10.1016/S0167-6377(98)00049-2
[14] Matlab, User’s Guide, The MathWorks, Natick, Massachusetts, 1994–2001; see website http://www.mathworks.com.
[15] Gilmore, P. C., and Gomory, R. E., A Linear Programming Approach to the Cutting Stock Problem, Operations Research, Vol. 9, pp. 849–859, 1961. · Zbl 0096.35501 · doi:10.1287/opre.9.6.849
[16] Dantzig, G. B., and Wolfe, P., Decomposition Principle for Linear Programs, Operations Research, Vol. 8, pp. 101–111, 1960. · Zbl 0093.32806 · doi:10.1287/opre.8.1.101
[17] Chvátal, V., Linear Programming, W. H. Freeman and Company, New York, NY, 1983.
[18] Murty, K. G., Linear Programming, John Wiley and Sons, New York, NY, 1983. · Zbl 0521.90071
[19] Luenberger, D. G., Linear and Nonlinear Programming, 2nd Edition, Addison-Wesley, Reading, Massachusetts, 1984. · Zbl 0571.90051
[20] Melli, G., Synthetic Classification Data Sets (SCDS), Report, School of Computing Science, Simon Fraser University, Burnaby, British Columbia, Canada, 1997; see websites http://fas.sfu.ca/cs/people/GradStudents/melli/SCDS/ and ftp://ftp.cs.wisc.edu/pub/dmi/data/SCDS/ .
[21] Musicant, D. R., NDC: Normally Distributed Clustered Datasets, 1998; see website http://www.cs.wisc.edu/dmi/svm/ndc/ .
[22] Murtagh, B. A., and Saunders, M. A., MINOS 5.0 User’s Guide, Technical Report SOL 83.20, Stanford University, 1983; see version MINOS 5.4 Release Notes, 1992.