
Knowledge reduction based on divide and conquer method in rough set theory. (English) Zbl 1264.68175

Summary: The divide and conquer method is a typical granular computing method that uses multiple levels of abstraction and granulation. Although some results based on the divide and conquer method have been obtained in rough set theory, systematic methods for knowledge reduction based on divide and conquer are still absent. In this paper, knowledge reduction approaches based on the divide and conquer method are presented under the equivalence relation and under the tolerance relation, respectively. A systematic approach, termed the abstract process for knowledge reduction based on the divide and conquer method in rough set theory, is then proposed. Based on this approach, two knowledge reduction algorithms are presented: one for attribute reduction and one for attribute value reduction. The methods are evaluated experimentally on UCI data sets and the KDD Cup 99 data set. The results show that the proposed approaches process large data sets efficiently and achieve good recognition rates compared with KNN, SVM, C4.5, Naive Bayes, and CART.
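The attribute reduction setting described above can be illustrated with a minimal sketch. This is not the authors' divide-and-conquer algorithm itself; it is a classical positive-region-based greedy reduction under an equivalence (indiscernibility) relation, where the partition U/IND(B) is computed by repeatedly dividing the universe by attribute values (the "divide" step). All names and the toy decision table are illustrative assumptions.

```python
from collections import defaultdict

def partition(universe, rows, attrs):
    """Compute U/IND(attrs): successively divide each block by one attribute's values."""
    blocks = [list(universe)]
    for a in attrs:
        new_blocks = []
        for block in blocks:
            groups = defaultdict(list)
            for i in block:
                groups[rows[i][a]].append(i)
            new_blocks.extend(groups.values())
        blocks = new_blocks
    return blocks

def positive_region(rows, labels, attrs):
    """Objects whose indiscernibility class is pure w.r.t. the decision label."""
    pos = set()
    for block in partition(range(len(rows)), rows, attrs):
        if len({labels[i] for i in block}) == 1:
            pos.update(block)
    return pos

def greedy_reduct(rows, labels, all_attrs):
    """Greedily add the attribute that enlarges the positive region the most."""
    target = len(positive_region(rows, labels, all_attrs))
    reduct = []
    while len(positive_region(rows, labels, reduct)) < target:
        best = max((a for a in all_attrs if a not in reduct),
                   key=lambda a: len(positive_region(rows, labels, reduct + [a])))
        reduct.append(best)
    return reduct

# Toy decision table: each row is a tuple of condition attribute values.
rows = [(0, 1, 0), (0, 1, 1), (1, 0, 0), (1, 1, 0)]
labels = ['y', 'y', 'n', 'n']
print(greedy_reduct(rows, labels, [0, 1, 2]))  # attribute 0 alone separates the classes
```

A greedy search like this yields a superreduct, not necessarily a minimal reduct; the benefit of the divide-and-conquer partitioning is that each refinement step touches only the objects inside one block, which is what makes large decision tables tractable.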

MSC:

68T37 Reasoning under uncertainty in the context of artificial intelligence
91B44 Economics of information

Software:

C4.5; LERS

References:

[1] A. Bargiela and W. Pedrycz, Human-Centric Information Processing Through Granular Modelling, Springer, Berlin, Germany, 2009.
[2] W. Pedrycz, A. Skowron, and V. Kreinovich, Handbook of Granular Computing, Wiley Interscience, New York, NY, USA, 2008.
[3] J. T. Yao, Novel Developments in Granular Computing: Applications for Advanced Human Reasoning and Soft Computation, Information Science Reference, Hershey, Pa, USA, 2010.
[4] J. Yao, “A ten-year review of granular computing,” in Proceedings of the IEEE International Conference on Granular Computing (GRC ’07), pp. 734-739, November 2007.
[5] Y. Y. Yao, “Granular computing: past, present and future,” in Proceedings of the IEEE International Conference on Granular Computing, pp. 80-85, 2008.
[6] Y. Y. Yao and J. G. Luo, “Top-down progressive computing,” in Proceedings of the RSKT, pp. 734-742, Springer, Regina, Canada, 2011.
[7] Z. Pawlak, “Rough sets,” International Journal of Computer and Information Sciences, vol. 11, no. 5, pp. 341-356, 1982. · Zbl 0525.04005
[8] Z. Pawlak and A. Skowron, “Rudiments of rough sets,” Information Sciences, vol. 177, no. 1, pp. 3-27, 2007. · Zbl 1142.68549
[9] Z. Pawlak and A. Skowron, “Rough sets: some extensions,” Information Sciences, vol. 177, no. 1, pp. 28-40, 2007. · Zbl 1142.68550
[10] Z. Pawlak and A. Skowron, “Rough sets and boolean reasoning,” Information Sciences, vol. 177, no. 1, pp. 41-73, 2007. · Zbl 1142.68551
[11] J. W. Grzymala-Busse, “A new version of the rule induction system LERS,” Fundamenta Informaticae, vol. 31, no. 1, pp. 27-39, 1997. · Zbl 0882.68122
[12] F. Hu and G. Y. Wang, “A quick reduction algorithm based on attribute order,” Chinese Journal of Computers, vol. 30, no. 8, pp. 1430-1435, 2007 (Chinese).
[13] F. Hu, G. Wang, and Y. Xia, “Attribute core computing based on divide and conquer method,” in Proceedings of the International Conference on Rough Sets and Intelligent Systems Paradigms (RSEISP ’07), M. Kryszkiewicz et al., Eds., Lecture Notes in Artificial Intelligence 4585, pp. 310-319, Springer, Warsaw, Poland, 2007.
[14] F. Hu, G. Wang, and J. Dai, “Quick discretization algorithm for rough set based on dynamic clustering,” Journal of Southwest Jiaotong University, vol. 45, no. 6, pp. 977-983, 2010 (Chinese). · Zbl 1240.68317
[15] K. Hu, Y. Lu, and C. Shi, “Feature ranking in rough sets,” AI Communications, vol. 16, no. 1, pp. 41-50, 2003. · Zbl 1089.68644
[16] X. Hu and N. Cercone, “Learning in relational databases: a rough set approach,” Computational Intelligence, vol. 11, no. 2, pp. 323-338, 1995.
[17] M. Kryszkiewicz and H. Rybinski, “Computation of reducts of composed information systems,” Fundamenta Informaticae, vol. 27, no. 2-3, pp. 183-195, 1996. · Zbl 0854.68097
[18] D. F. Li, G. B. Li, and W. Zhang, “U/A partition based smallest reduction construction,” Journal of Wuhan University, vol. 51, pp. 269-272, 2005 (Chinese). · Zbl 1114.68539
[19] T. Y. Lin and N. Cercone, Eds., Rough Sets and Data Mining-Analysis of Imperfect Data, Kluwer Academic Publishers, Boston, Mass, USA, 1997.
[20] Q.-H. Liu, F. Li, F. Min, M. Ye, and G.-W. Yang, “Efficient knowledge reduction algorithm based on new conditional information entropy,” Control and Decision, vol. 20, no. 8, pp. 878-882, 2005 (Chinese). · Zbl 1115.68519
[21] S. W. Liu, Q.-J. Sheng, B. Wu, Z.-Z. Shi, F. Hu, et al., “Research on efficient algorithms for rough set methods,” Chinese Journal of Computers, vol. 40, pp. 637-642, 2003 (Chinese).
[22] D. Miao, C. Gao, N. Zhang, and Z. Zhang, “Diverse reduct subspaces based co-training for partially labeled data,” International Journal of Approximate Reasoning, vol. 52, no. 8, pp. 1103-1117, 2011. · Zbl 05976300
[23] M. J. Moshkov, M. Piliszczuk, and B. Zielosko, “On partial covers, reducts and decision rules with weights,” in Transactions on Rough Sets VI, vol. 4374 of Lecture Notes in Computer Science, pp. 211-246, Springer, Berlin, Germany, 2007. · Zbl 1186.68467
[24] M. R. Chmielewski, J. W. Grzymala-Busse, N. W. Peterson, and S. Than, “The rule induction system LERS - a new version for personal computers,” in Proceedings of the International Workshop on Rough Sets and Knowledge Discovery (RSKD ’93), Alberta, Canada, 1993. · Zbl 0806.68089
[25] J. M. Moshkov, A. Skowron, and Z. Suraj, “On minimal rule sets for almost all binary information systems,” Fundamenta Informaticae, vol. 80, no. 1-3, pp. 247-258, 2008. · Zbl 1137.68618
[26] H. S. Nguyen, “From optimal hyperplanes to optimal decision trees,” Fundamenta Informaticae, vol. 34, no. 1-2, pp. 145-174, 1998. · Zbl 0903.68161
[27] H. S. Nguyen, “A soft decision tree,” in Proceedings of the Intelligent Information Systems (IIS ’02), M. A. Klopotek, S. Wierzchon, and M. Michalewicz, Eds., Advances in Soft Computing, pp. 57-66, Springer, Berlin, Germany, 2002. · Zbl 1088.68570
[28] H. S. Nguyen, “Approximate Boolean reasoning: foundations and applications in data mining,” in Transactions on Rough Sets V, vol. 4100, pp. 334-506, Springer, Berlin, Germany, 2006. · Zbl 1136.68497
[29] H. S. Nguyen, A. Skowron, and P. Synak, “Discovery of data patterns with applications to decomposition and classification problems,” in Rough sets in knowledge discovery 2, L. Polkowski and A. Skowron, Eds., vol. 19, pp. 55-97, Physica, Berlin, Germany, 1998.
[30] S. H. Nguyen and H. S. Nguyen, “Some efficient algorithms for rough set methods,” in Proceedings of the Conference of Information Processing and Management of Uncertainty in Knowledge-Based Systems (IPMU ’96), pp. 1451-1456, Granada, Spain, 1996.
[31] S. H. Nguyen and H. S. Nguyen, “Pattern extraction from data,” Fundamenta Informaticae, vol. 34, no. 1-2, pp. 129-144, 1998. · Zbl 0903.68054
[32] S. K. Pal, L. Polkowski, and A. Skowron, Rough-Neural Computing: Techniques for Computing with Words, Cognitive Technologies, Springer, Berlin, Germany, 2004. · Zbl 1040.68113
[33] Y. Qian, J. Liang, W. Pedrycz, and C. Dang, “An efficient accelerator for attribute reduction from incomplete data in rough set framework,” Pattern Recognition, vol. 44, no. 8, pp. 1658-1670, 2011. · Zbl 1218.68152
[34] A. Skowron and C. Rauszer, “The discernibility matrices and functions in information systems,” in Intelligent Decision Support-Handbook of Applications and Advances of the Rough Sets Theory, R. Slowinski, Ed., pp. 331-362, Kluwer Academic Publishers, Dordrecht, The Netherlands, 1992.
[35] A. Skowron, Z. Pawlak, J. Komorowski, and L. Polkowski, “A rough set perspective on data and knowledge,” in Handbook of KDD, W. Kloesgen and J. Zytkow, Eds., pp. 134-149, Oxford University Press, Oxford, UK, 2002.
[36] G. Y. Wang, Rough Set Theory and Knowledge Acquisition, Xi’an Jiaotong University Press, 2001.
[37] G. Y. Wang, H. Yu, and D. C. Yang, “Decision table reduction based on conditional information entropy,” Chinese Journal of Computers, vol. 25, no. 7, pp. 759-766, 2002 (Chinese).
[38] Jue Wang and Ju Wang, “Reduction algorithms based on discernibility matrix: the ordered attributes method,” Journal of Computer Science and Technology, vol. 16, no. 6, pp. 489-504, 2001. · Zbl 1014.68160
[39] Y. Yao and Y. Zhao, “Discernibility matrix simplification for constructing attribute reducts,” Information Sciences, vol. 179, no. 7, pp. 867-882, 2009. · Zbl 1162.68704
[40] H.-Z. Yang, L. Yee, and M.-W. Shao, “Rule acquisition and attribute reduction in real decision formal contexts,” Soft Computing, vol. 15, no. 6, pp. 1115-1128, 2011. · Zbl 1237.68218
[41] M. Zhao, The data description based on reduct [Ph.D. dissertation], Institute of Automation, Chinese Academy of Sciences, Beijing, China, 2004.
[42] J. Zhou, D. Miao, W. Pedrycz, and H. Zhang, “Analysis of alternative objective functions for attribute reduction in complete decision tables,” Soft Computing, vol. 15, no. 8, pp. 1601-1616, 2011. · Zbl 1237.68227
[43] W. Ziarko, N. Cerone, and X. Hu, “Rule discovery from database with decision matrices,” in Proceedings of the 9th International Symposium on Foundation of Intelligent Systems (ISMIS ’96), pp. 653-662, Zakopane, Poland, May 1996.
[44] X. Wu, V. Kumar, J. R. Quinlan, et al., “Top 10 algorithms in data mining,” Knowledge and Information Systems, vol. 14, no. 1, pp. 1-37, 2008. · Zbl 05348912
[45] F. Hu and G. Y. Wang, “Analysis of the complexity of quick sort for two-dimensional tables,” Chinese Journal of Computers, vol. 30, no. 6, pp. 963-968, 2007 (Chinese).
This reference list is based on information provided by the publisher or from digital mathematics libraries. Its items are heuristically matched to zbMATH identifiers and may contain data conversion errors. It attempts to reflect the references listed in the original paper as accurately as possible without claiming the completeness or perfect precision of the matching.