
Fuzzy one-class classification model using contamination neighborhoods. (English) Zbl 1270.68255

Summary: A fuzzy one-class classification model is studied in the paper. It is based on the contaminated (robust) model, which produces fuzzy expected risk measures characterizing classification errors. Optimal classification parameters of the model are derived by minimizing the fuzzy expected risk. It is shown that the algorithm for computing the classification parameters reduces to a set of standard support vector machine tasks with weighted data points. Experimental results with synthetic data illustrate the proposed fuzzy model.
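One subtask of the reduction described above can be sketched as a standard one-class SVM trained on weighted data points. This is a minimal illustration only: the per-point weights below follow an ad-hoc rule invented for the example, whereas in the paper they would be produced by the contamination-neighborhood (fuzzy) model.

```python
# Sketch: a single weighted one-class SVM subtask, using scikit-learn.
# The weights here are hypothetical stand-ins for the fuzzy model's weights.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
X = rng.normal(loc=0.0, scale=1.0, size=(200, 2))  # synthetic "normal" data

# Hypothetical per-point weights (down-weight points far from the origin);
# the actual weights would come from the contaminated (robust) model.
weights = 1.0 / (1.0 + np.linalg.norm(X, axis=1))

clf = OneClassSVM(kernel="rbf", nu=0.1, gamma="scale")
clf.fit(X, sample_weight=weights)

pred = clf.predict(X)  # +1 for inliers, -1 for outliers
print(pred.shape, set(np.unique(pred)) <= {-1, 1})
```

The full algorithm would solve several such weighted tasks, one per point of the contamination neighborhood, and pick the parameters minimizing the fuzzy expected risk.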

MSC:

68T05 Learning and adaptive systems in artificial intelligence
62H30 Classification and discrimination; cluster analysis (statistical aspects)
62H86 Multivariate analysis and fuzziness

Software:

UCI-ml

References:

[1] Y. Hao, Z. Chi, D. Yan, and X. Yue, “An improved fuzzy support vector machine for credit rating,” in Network and Parallel Computing, K. Li, C. Jesshope, H. Jin, and J.-L. Gaudiot, Eds., vol. 4672 of Lecture Notes in Computer Science, pp. 495-505, Springer, Berlin, Germany, 2007.
[2] C. F. Lin and S. D. Wang, “Fuzzy support vector machines,” IEEE Transactions on Neural Networks, vol. 13, no. 2, pp. 464-471, 2002. · doi:10.1109/72.991432
[3] C. F. Lin and S. D. Wang, “Training algorithms for fuzzy support vector machines with noisy data,” Pattern Recognition Letters, vol. 25, no. 14, pp. 1647-1656, 2004. · doi:10.1016/j.patrec.2004.06.009
[4] C. F. Lin and S. D. Wang, “Fuzzy support vector machines with automatic membership setting,” in Support Vector Machines: Theory and Applications, L. Wang, Ed., vol. 177 of Studies in Fuzziness and Soft Computing, pp. 629-629, Springer, Berlin, Germany, 2005.
[5] W. M. Tang, “Fuzzy SVM with a new fuzzy membership function to solve the two-class problems,” Neural Processing Letters, vol. 34, pp. 209-219, 2011.
[6] V. Vapnik, Statistical Learning Theory, John Wiley & Sons, New York, NY, USA, 1998. · Zbl 0935.62007
[7] P. Y. Hao, “Fuzzy one-class support vector machines,” Fuzzy Sets and Systems, vol. 159, no. 18, pp. 2317-2336, 2008. · Zbl 1186.68367 · doi:10.1016/j.fss.2008.01.013
[8] X. Jiang, Z. Yi, and J. C. Lv, “Fuzzy SVM with a new fuzzy membership function,” Neural Computing and Applications, vol. 15, no. 3-4, pp. 268-276, 2006. · Zbl 05074986 · doi:10.1007/s00521-006-0028-z
[9] R. Burduk, “Probability error in global optimal hierarchical classifier with intuitionistic fuzzy observations,” in Hybrid Artificial Intelligence Systems, E. Corchado, X. Wu, E. Oja, A. Herrero, and B. Baruque, Eds., vol. 5572 of Lecture Notes in Computer Science, pp. 533-540, Springer, Berlin, Germany, 2009.
[10] R. Burduk, “Classification error in Bayes multistage recognition task with fuzzy observations,” Pattern Analysis and Applications, vol. 13, no. 1, pp. 85-91, 2010. · Zbl 1422.68210 · doi:10.1007/s10044-008-0143-9
[11] L. V. Utkin and F. P. A. Coolen, “Interval-valued regression and classification models in the framework of machine learning,” in Proceedings of the 7th International Symposium on Imprecise Probability: Theories and Applications (ISIPTA ’11), F. Coolen, G. de Cooman, Th. Fetz, and M. Oberguggenberger, Eds., pp. 371-380, Innsbruck, Austria, 2011.
[12] T. Wilk and M. Wozniak, “Soft computing methods applied to combination of one-class classifiers,” Neurocomputing, vol. 75, no. 1, pp. 185-193, 2012.
[13] P. J. Huber, Robust Statistics, John Wiley & Sons, New York, NY, USA, 1981. · Zbl 0536.62025
[14] C. Campbell and K. P. Bennett, “A linear programming approach to novelty detection,” in Advances in Neural Information Processing Systems, T. K. Leen, T. G. Dietterich, and V. Tresp, Eds., vol. 13, pp. 395-401, MIT Press, 2001.
[15] V. Cherkassky and F. M. Mulier, Learning From Data: Concepts, Theory, and Methods, Wiley-IEEE Press, 2007. · Zbl 1130.62002
[16] B. Schölkopf, R. Williamson, A. Smola, J. Shawe-Taylor, and J. Platt, “Support vector method for novelty detection,” in Advances in Neural Information Processing Systems, pp. 526-532, 2000.
[17] D. M. J. Tax and R. P. W. Duin, “Support vector domain description,” Pattern Recognition Letters, vol. 20, no. 11-13, pp. 1191-1199, 1999. · doi:10.1016/S0167-8655(99)00087-2
[18] B. Schölkopf, J. C. Platt, J. Shawe-Taylor, A. J. Smola, and R. C. Williamson, “Estimating the support of a high-dimensional distribution,” Neural Computation, vol. 13, no. 7, pp. 1443-1471, 2001. · Zbl 1009.62029 · doi:10.1162/089976601750264965
[19] B. Schölkopf and A. J. Smola, Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond, MIT Press, Cambridge, Mass, USA, 2002.
[20] H. Xu, C. Caramanis, and S. Mannor, “Robustness and regularization of support vector machines,” Journal of Machine Learning Research, vol. 10, pp. 1485-1510, 2009. · Zbl 1235.68209
[21] A. Ben-Tal, L. E. Ghaoui, and A. Nemirovski, Robust Optimization, Princeton University Press, Princeton, NJ, USA, 2009. · Zbl 1221.90001
[22] J. Bi and T. Zhang, “Support vector classification with input data uncertainty,” in Advances in Neural Information Processing Systems, L. K. Saul, Y. Weiss, and L. Bottou, Eds., vol. 17, pp. 161-168, MIT Press, Cambridge, Mass, USA, 2004.
[23] F. Provost and T. Fawcett, “Robust classification for imprecise environments,” Machine Learning, vol. 42, no. 3, pp. 203-231, 2001. · Zbl 0969.68126 · doi:10.1023/A:1007601015854
[24] L. Xu, K. Crammer, and D. Schuurmans, “Robust support vector machine training via convex outlier ablation,” in Proceedings of the 21st National Conference on Artificial Intelligence (AAAI ’06), pp. 536-542, MIT Press, Boston, Mass, USA, July 2006.
[25] G. R. G. Lanckriet, L. El Ghaoui, C. Bhattacharyya, and M. I. Jordan, “A robust minimax approach to classification,” Journal of Machine Learning Research, vol. 3, no. 3, pp. 555-582, 2003. · Zbl 1084.68657 · doi:10.1162/153244303321897726
[26] J. O. Berger, Statistical Decision Theory and Bayesian Analysis, Springer, New York, NY, USA, 1985. · Zbl 0572.62008
[27] L. V. Utkin and Y. A. Zhuk, “Fuzzy decision making using the imprecise Dirichlet model,” International Journal of Mathematics in Operational Research, vol. 5, no. 1, pp. 74-90, 2013. · Zbl 1390.90361
[28] L. M. Campos and A. Gonzalez, “A subjective approach for ranking fuzzy numbers,” Fuzzy Sets and Systems, vol. 29, no. 2, pp. 145-153, 1989. · Zbl 0672.90001 · doi:10.1016/0165-0114(89)90188-7
[29] J. Schubert, “On ρ in a decision-theoretic apparatus of Dempster-Shafer theory,” International Journal of Approximate Reasoning, vol. 13, no. 3, pp. 185-200, 1995. · Zbl 0949.91503 · doi:10.1016/0888-613X(95)00061-K
[30] J. L. Hodges and E. Lehmann, “The use of previous experience in reaching statistical decisions,” The Annals of Mathematical Statistics, vol. 23, no. 3, pp. 396-407, 1952. · Zbl 0047.38306 · doi:10.1214/aoms/1177729384
[31] A. N. Tikhonov and V. Y. Arsenin, Solution of Ill-Posed Problems, W.H. Winston, Washington, DC, USA, 1977. · Zbl 0354.65028
[32] T. Evgeniou, T. Poggio, M. Pontil, and A. Verri, “Regularization and statistical learning theory for data analysis,” Computational Statistics and Data Analysis, vol. 38, no. 4, pp. 421-432, 2002. · Zbl 1072.62642 · doi:10.1016/S0167-9473(01)00069-X
[33] A. Frank and A. Asuncion, UCI Machine Learning Repository, 2010.