
A hierarchical model for test-cost-sensitive decision systems. (English) Zbl 1192.68651

Summary: Cost-sensitive learning is an important issue in both data mining and machine learning, since it addresses learning from decision systems under a variety of costs. In this paper, we introduce a hierarchy of cost-sensitive decision systems from the test cost perspective. Two major issues of test cost dependency are addressed. The first concerns the common test cost, where a group of tests shares a single cost; the second concerns the sequence-dependent test cost, where the order of the test sequence influences the total cost. Theoretical aspects of each of the six models in the hierarchy are investigated and illustrated via examples. The proposed models are shown to be useful for exploiting cost-related information in a variety of applications.
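The two dependency notions from the summary can be sketched in code. This is an illustrative sketch only, not the paper's formal models: all function names, group encodings, and cost figures below are hypothetical. The first function assumes a "common cost" is paid once per group of tests used (e.g. a single blood draw covering several blood tests) plus a per-test marginal cost; the second assumes order matters because a test is discounted when it directly follows a designated predecessor.

```python
def common_cost_total(tests, group_cost, extra_cost):
    """Common test cost: tests in the same group share a one-off group
    cost, paid once per group used, plus each test's own marginal cost.
    Here a test's group is encoded as the prefix before the dot."""
    groups_used = {t.split('.')[0] for t in tests}
    return (sum(group_cost[g] for g in groups_used)
            + sum(extra_cost[t] for t in tests))


def sequence_cost_total(sequence, base_cost, followup_discount):
    """Sequence-dependent test cost: the total depends on the order.
    A test gets a discount when it immediately follows the test it is
    paired with in `followup_discount` (a hypothetical discount table)."""
    total = 0.0
    prev = None
    for t in sequence:
        cost = base_cost[t]
        # apply discount only if the previous test enables it
        cost -= followup_discount.get((prev, t), 0.0)
        total += cost
        prev = t
    return total


# Hypothetical figures: one shared blood-draw cost, two marginal costs.
print(common_cost_total(
    ["blood.glucose", "blood.lipid"],
    group_cost={"blood": 10.0},
    extra_cost={"blood.glucose": 2.0, "blood.lipid": 3.0}))  # 15.0

# Same two tests, different orders, different totals.
base = {"a": 5.0, "b": 5.0}
disc = {("a", "b"): 2.0}
print(sequence_cost_total(["a", "b"], base, disc))  # 8.0
print(sequence_cost_total(["b", "a"], base, disc))  # 10.0
```

The asymmetry in the last two calls is the point of the sequence-dependent model: the same set of tests can have different total costs depending on the order in which they are performed.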

MSC:

68T20 Problem solving in the context of artificial intelligence (heuristics, search strategies, etc.)
68T05 Learning and adaptive systems in artificial intelligence

Software:

AdaCost
