
zbMATH — the first resource for mathematics

Comprehensible credit scoring models using rule extraction from support vector machines. (English) Zbl 1278.91177
Summary: In recent years, support vector machines (SVMs) have been successfully applied to a wide range of problems. However, because the resulting classifier is a complex mathematical function, it is largely incomprehensible to humans. This opacity prevents SVMs from being used in many real-life applications where both accuracy and comprehensibility are required, such as medical diagnosis and credit risk evaluation. To overcome this limitation, rules can be extracted from the trained SVM that are interpretable by humans and retain as much of the SVM's accuracy as possible. In this paper, we provide an overview of recently proposed rule extraction techniques for SVMs and introduce two further techniques taken from the artificial neural networks domain, namely Trepan and G-REX. The described techniques are compared on publicly available datasets, such as Ripley’s synthetic dataset and the multi-class iris dataset. We also consider medical diagnosis and credit scoring, where comprehensibility is a key requirement and even a regulatory recommendation. Our experiments show that the SVM rule extraction techniques lose only a small percentage in performance compared to SVMs and therefore rank at the top of comprehensible classification techniques.
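The two-step scheme the summary describes — train an SVM, then extract human-readable rules that mimic its behavior — can be illustrated with a surrogate decision tree, the core idea behind pedagogical extractors such as Trepan. The following is a minimal sketch assuming scikit-learn; the iris data, RBF kernel, and depth limit are illustrative choices, not the paper's exact experimental setup:

```python
# Pedagogical rule extraction sketch: fit a shallow decision tree to the
# SVM's *predictions* (not the true labels), so the tree's paths read as
# if-then rules approximating the SVM's decision function.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier, export_text

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Step 1: train the opaque model.
svm = SVC(kernel="rbf", gamma="scale").fit(X_tr, y_tr)

# Step 2: relabel the training data with the SVM's own predictions and
# fit a depth-limited surrogate tree that mimics the SVM.
surrogate = DecisionTreeClassifier(max_depth=3, random_state=0)
surrogate.fit(X_tr, svm.predict(X_tr))

# Fidelity: agreement with the SVM; accuracy: agreement with true labels.
fidelity = (surrogate.predict(X_te) == svm.predict(X_te)).mean()
accuracy = (surrogate.predict(X_te) == y_te).mean()
print(export_text(surrogate))
print(f"fidelity to SVM: {fidelity:.2f}, test accuracy: {accuracy:.2f}")
```

The distinction between fidelity (how faithfully the rules mimic the SVM) and accuracy (how well they classify the true labels) mirrors the standard evaluation criteria for extracted rule sets in this literature.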

MSC:
91G40 Credit risk
62H30 Classification and discrimination; cluster analysis (statistical aspects)
Software:
C4.5
References:
[1] Andrews, R.; Diederich, J.; Tickle, A.B., Survey and critique of techniques for extracting rules from trained artificial neural networks, Knowledge-Based Systems, 8, 6, 373-389, (1995)
[2] Baesens, B.; Van Gestel, T.; Viaene, S.; Stepanova, M.; Suykens, J.; Vanthienen, J., Benchmarking state-of-the-art classification algorithms for credit scoring, Journal of the Operational Research Society, 54, 6, 627-635, (2003) · Zbl 1097.91516
[3] Baesens, B.; Setiono, R.; Mues, C.; Vanthienen, J., Using neural network rule extraction and decision tables for credit-risk evaluation, Management Science, 49, 3, 312-329, (2003) · Zbl 1232.91684
[4] N. Barakat, J. Diederich, Learning-based rule-extraction from support vector machines. In: 14th International Conference on Computer Theory and Applications ICCTA 2004 Proceedings, Alexandria, Egypt, 2004.
[5] Browne, A.; Hudson, B.; Whitley, D.; Picton, P., Biological data mining with neural networks: implementation and application of a flexible decision tree extraction algorithm to genomic problem domains, Neurocomputing, 57, 275-293, (2004)
[6] M.W. Craven, Extracting comprehensible models from trained neural networks. Ph.D. thesis, University of Wisconsin-Madison, 1996. Supervisor: J.W. Shavlik.
[7] Craven, M.W.; Shavlik, J.W., Extracting tree-structured representations of trained neural networks, Advances in Neural Information Processing Systems, 8, 24-30, (1996)
[8] Cristianini, N.; Shawe-Taylor, J., An introduction to support vector machines and other kernel-based learning methods, (2000), Cambridge University Press New York, NY, USA
[9] Drucker, H.; Wu, D.; Vapnik, V., Support vector machines for spam categorization, IEEE Transactions on Neural Networks, 10, 5, 1048-1054, (1999)
[10] D.W. Dwyer, A.E. Kocagil, R.M. Stein, Moody’s KMV RiskCalc v3.1 Model, 2004.
[11] Fung, G.; Sandilya, S.; Bharat Rao, R., Rule extraction from linear support vector machines, (), 32-40 · Zbl 1148.68433
[12] Van Gestel, T.; Baesens, B.; Suykens, J.; Van den Poel, D.; Baestaens, D.-E.; Willekens, M., Bayesian kernel based classification for financial distress detection, European Journal of Operational Research, 172, 3, 979-1003, (2006) · Zbl 1111.90330
[13] T. Van Gestel, J.A.K. Suykens, B. Baesens, S. Viaene, J. Vanthienen, G. Dedene, B. De Moor, J. Vandewalle, Benchmarking least squares support vector machine classifiers. CTEO, Technical Report 0037, K.U. Leuven, Belgium, 2000. · Zbl 1078.68737
[14] Van Gestel, T.; Suykens, J.A.K.; Baestaens, D.-E.; Lambrechts, A.; Lanckriet, G.; Vandaele, B.; De Moor, B.; Vandewalle, J., Financial time series prediction using least squares support vector machines with the evidence framework, IEEE Transactions on Neural Networks, 12, 4, 809-821, (2001)
[15] S. Hettich, S.D. Bay, The UCI KDD Archive, 1996. <http://kdd.ics.uci.edu>.
[16] U. Johansson, R. König, L. Niklasson, The truth is in there – rule extraction from opaque models using genetic programming. In: 17th International Florida Artificial Intelligence Research Society Conference (FLAIRS) Proceedings, 2004.
[17] J.T. Yao, Sensitivity analysis for data mining. In: 22nd International Conference of NAFIPS Proceedings, 2003, pp. 272-277.
[18] Koza, J.R., Genetic programming: on the programming of computers by means of natural selection, (1992), MIT Press Cambridge, Mass · Zbl 0850.68161
[19] Lu, C.; Van Gestel, T.; Suykens, J.A.K.; Van Huffel, S.; Vergote, I.; Timmerman, D., Preoperative prediction of malignancy of ovarian tumors using least squares support vector machines, Artificial Intelligence in Medicine, 28, 3, 281-306, (2003)
[20] Mannino, M.V.; Koushik, M.V., The cost-minimizing inverse classification problem: a genetic algorithm approach, Decision Support Systems, 29, 3, 283-300, (2000)
[21] H. Núñez, C. Angulo, A. Català, Rule extraction from support vector machines. In: European Symposium on Artificial Neural Networks Proceedings, 2002, pp. 107-112.
[22] H. Núñez, C. Angulo, A. Català, Rule based learning systems from SVM and RBFNN. Tendencias de la minería de datos en España, Red Española de Minería de Datos, 2004.
[23] Pochet, N.; De Smet, F.; Suykens, J.A.K.; De Moor, B.L.R., Systematic benchmarking of microarray data classification: assessing the role of non-linearity and dimensionality reduction, Bioinformatics, 20, 17, 3185-3195, (2004)
[24] Quinlan, J.R., Induction of decision trees, Machine Learning, 1, 1, 81-106, (1986)
[25] Quinlan, J.R., C4.5: programs for machine learning, (1993), Morgan Kaufmann Publishers Inc. San Francisco, CA, USA
[26] Ripley, B.D., Neural networks and related methods for classification, Journal of the Royal Statistical Society B, 56, 409-456, (1994) · Zbl 0815.62037
[27] Silverman, B.W., Density estimation for statistics and data analysis, (1986), Chapman and Hall · Zbl 0617.62042
[28] Vapnik, V.N., The nature of statistical learning theory, (1995), Springer-Verlag New York, Inc., New York, NY, USA · Zbl 0934.62009
This reference list is based on information provided by the publisher or from digital mathematics libraries. Its items are heuristically matched to zbMATH identifiers and may contain data conversion errors. It attempts to reflect the references listed in the original paper as accurately as possible without claiming the completeness or perfect precision of the matching.