
zbMATH — the first resource for mathematics

Learning ELM-tree from big data based on uncertainty reduction. (English) Zbl 1335.68213
Summary: A challenge in big data classification is the design of highly parallelized learning algorithms. One solution to this problem is applying parallel computation to different components of a learning model. In this paper, we first propose an extreme learning machine tree (ELM-Tree) model based on the heuristics of uncertainty reduction. In the ELM-Tree model, information entropy and ambiguity are used as the uncertainty measures for splitting decision tree (DT) nodes. In addition, to resolve the over-partitioning problem in DT induction, ELMs are embedded as the leaf nodes when the gain ratios of all the available splits are smaller than a given threshold. We then apply parallel computation to five components of the ELM-Tree model, which effectively reduces the computational time for big data classification. Experimental studies demonstrate the effectiveness of the proposed method.
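The summary's key mechanism, falling back to an ELM leaf when no split's gain ratio exceeds a threshold, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the `gain_ratio` helper, the `ELMLeaf` class, the hidden-layer size, and the tanh activation are all assumptions chosen for brevity; the paper additionally uses an ambiguity measure and parallelizes five components, which are omitted here.

```python
import numpy as np

def entropy(y):
    """Shannon entropy (bits) of a label vector."""
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def gain_ratio(y, mask):
    """Gain ratio of a boolean split `mask` on labels `y` (C4.5-style)."""
    n, n_l = len(y), int(mask.sum())
    if n_l == 0 or n_l == n:
        return 0.0
    cond = (n_l / n) * entropy(y[mask]) + ((n - n_l) / n) * entropy(y[~mask])
    gain = entropy(y) - cond
    p = np.array([n_l / n, (n - n_l) / n])
    return gain / -np.sum(p * np.log2(p))   # divide by split information

class ELMLeaf:
    """Single-hidden-layer ELM classifier: random input weights and biases,
    output weights solved by least squares via the Moore-Penrose pseudoinverse."""
    def __init__(self, n_hidden=20, seed=0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        T = (y[:, None] == self.classes_[None, :]).astype(float)  # one-hot targets
        self.W = self.rng.standard_normal((X.shape[1], self.n_hidden))
        self.b = self.rng.standard_normal(self.n_hidden)
        H = np.tanh(X @ self.W + self.b)      # hidden-layer output matrix
        self.beta = np.linalg.pinv(H) @ T     # analytic output weights
        return self

    def predict(self, X):
        H = np.tanh(X @ self.W + self.b)
        return self.classes_[np.argmax(H @ self.beta, axis=1)]

def grow_or_leaf(X, y, candidate_masks, threshold=0.1):
    """If every candidate split's gain ratio is below `threshold`,
    stop partitioning and fit an ELM leaf on the node's data."""
    ratios = [gain_ratio(y, m) for m in candidate_masks]
    if not ratios or max(ratios) < threshold:
        return ("elm_leaf", ELMLeaf().fit(X, y))
    return ("split", int(np.argmax(ratios)))
```

A perfectly separating binary split attains the maximum gain ratio of 1.0, so such a node is split further; a node where all candidate splits are uninformative gets an ELM leaf instead of being over-partitioned.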

MSC:
68T05 Learning and adaptive systems in artificial intelligence
68T37 Reasoning under uncertainty in the context of artificial intelligence
Software:
PLANET; UCI-ml; SPRINT; C4.5
References:
[1] Barakat, M.; Lefebvre, D.; Khalil, M.; Druaux, F.; Mustapha, O., Parameter selection algorithm with self adaptive growing neural network classifier for diagnosis issues, Int. J. Mach. Learn. Cybern., 4, 3, 217-233, (2013)
[2] Ben-Haim, Y.; Tom-Tov, E., A streaming parallel decision tree algorithm, J. Mach. Learn. Res., 11, 849-872, (2010) · Zbl 1242.68204
[3] Chen, C. J., Structural vibration suppression by using neural classifier with genetic algorithm, Int. J. Mach. Learn. Cybern., 3, 3, 215-221, (2012)
[4] Demšar, J., Statistical comparisons of classifiers over multiple data sets, J. Mach. Learn. Res., 7, 1-30, (2006) · Zbl 1222.68184
[5] Dong, X. L.; Srivastava, D., Big data integration, (Proceedings of ICDE’13, (2013)), 1245-1248
[6] Ferrari, S.; Stengel, R. F., Smooth function approximation using neural networks, IEEE Trans. Neural Netw., 16, 1, 24-38, (2005)
[7] Frank, E.; Wang, Y.; Inglis, S.; Holmes, G.; Witten, I. H., Using model trees for classification, Mach. Learn., 32, 1, 63-76, (1998) · Zbl 0901.68167
[8] Friedman, J.; Hastie, T.; Tibshirani, R., Additive logistic regression: a statistical view of boosting, Ann. Stat., 38, 2, 337-374, (2000) · Zbl 1106.62323
[9] Gama, J., Functional trees, Mach. Learn., 55, 3, 219-250, (2004) · Zbl 1078.68699
[10] Gattiker, A.; Gebara, F. H.; Hofstee, H. P.; Hayes, J. D.; Hylick, A., Big data text-oriented benchmark creation for hadoop, IBM J. Res. Dev., 57, 3-4, 10:1-10:6, (2013)
[11] He, Q.; Shang, T. F.; Zhuang, F. Z.; Shi, Z. Z., Parallel extreme learning machine for regression based on MapReduce, Neurocomputing, 102, 52-58, (2013)
[12] Higashi, M.; Klir, G. J., Measures of uncertainty and information based on possibility distributions, Int. J. Gen. Syst., 9, 1, 43-58, (1982) · Zbl 0497.94008
[13] Huang, G. B.; Wang, D. H.; Lan, Y., Extreme learning machines: a survey, Int. J. Mach. Learn. Cybern., 2, 2, 107-122, (2011)
[14] Huang, G. B.; Zhou, H. M.; Ding, X. J.; Zhang, R., Extreme learning machine for regression and multiclass classification, IEEE Trans. Syst. Man Cybern., Part B, Cybern., 42, 2, 513-529, (2012)
[15] Huang, G. B.; Zhu, Q. Y.; Siew, C. K., Extreme learning machine: theory and applications, Neurocomputing, 70, 489-501, (2006)
[16] Kohavi, R., Scaling up the accuracy of naive Bayes classifiers: a decision-tree hybrid, (Proceedings of KDD’96, (1996)), 202-207
[17] Landwehr, N.; Hall, M.; Frank, E., Logistic model trees, Mach. Learn., 59, 1-2, 161-205, (2005) · Zbl 1101.68767
[18] Lomax, S.; Vadera, S., A survey of cost-sensitive decision tree induction algorithms, ACM Comput. Surv., 45, 2, 16:1-16:35, (2013) · Zbl 1293.68232
[19] Malik, P., Governing big data: principles and practices, IBM J. Res. Dev., 57, 3-4, 1:1-1:13, (2013)
[20] Panda, B.; Herbach, J. S.; Basu, S.; Bayardo, R. J., PLANET: massively parallel learning of tree ensembles with MapReduce, Proc. VLDB Endow., 2, 2, 1426-1437, (2009)
[21] Quinlan, J. R., Induction of decision trees, Mach. Learn., 1, 1, 81-106, (1986)
[22] Quinlan, J. R., Improved use of continuous attributes in C4.5, J. Artif. Intell. Res., 4, 77-90, (1996) · Zbl 0900.68112
[23] Shafer, J.; Agrawal, R.; Mehta, M., SPRINT: a scalable parallel classifier for data mining, (Proceedings of VLDB’96, (1996)), 544-555
[24] Sheng, Y.; Phoha, V. V.; Rovnyak, S. M., A parallel decision tree-based method for user authentication based on keystroke patterns, IEEE Trans. Syst. Man Cybern., Part B, Cybern., 35, 4, 826-833, (2005)
[25] Srivastava, A.; Han, E. H.; Kumar, V.; Singh, V., Parallel formulations of decision-tree classification algorithms, Data Min. Knowl. Discov., 3, 3, 237-261, (1999)
[26] Sumner, M.; Frank, E.; Hall, M., Speeding up logistic model tree induction, (Knowledge Discovery in Databases, PKDD 2005, Lect. Notes Comput. Sci., vol. 3721, (2005)), 675-683
[27] UCI Machine Learning Repository, available online:
[28] Wang, X. Z.; He, Y. L.; Wang, D. D., Non-naive Bayesian classifiers for classification problems with continuous attributes, IEEE Trans. Cybern., 44, 1, 21-39, (2014)
[29] Wang, X. Z.; Hong, J. R., On the handling of fuzziness for continuous-valued attributes in decision tree generation, Fuzzy Sets Syst., 99, 3, 283-290, (1998) · Zbl 0937.68106
[30] Wang, T.; Qin, Z. X.; Jin, Z.; Zhang, S. C., Handling over-fitting in test cost-sensitive decision tree learning by feature selection, smoothing and pruning, J. Syst. Softw., 83, 7, 1137-1147, (2010)
[31] Wang, X. Z.; Yeung, D. S.; Tsang, E. C.C., A comparative study on heuristic algorithms for generating fuzzy decision trees, IEEE Trans. Syst. Man Cybern., Part B, Cybern., 31, 2, 215-226, (2001)
[32] Wang, X. Z.; Zhai, J. H.; Lu, S. X., Induction of multiple fuzzy decision trees based on rough set technique, Inf. Sci., 178, 16, 3188-3202, (2008) · Zbl 1154.68529
[33] Witten, I. H.; Frank, E., Data mining: practical machine learning tools and techniques, (2005), Morgan Kaufmann · Zbl 1076.68555
[34] Yuan, Y. F.; Shaw, M. J., Induction of fuzzy decision trees, Fuzzy Sets Syst., 69, 2, 125-139, (1995)
[35] Zaki, M. J., Parallel and distributed association mining: a survey, IEEE Concurr., 7, 4, 14-25, (1999)
[36] Zheng, S. F., Gradient descent algorithms for quantile regression with smooth approximation, Int. J. Mach. Learn. Cybern., 2, 3, 191-207, (2011)
This reference list is based on information provided by the publisher or from digital mathematics libraries. Its items are heuristically matched to zbMATH identifiers and may contain data conversion errors. It attempts to reflect the references listed in the original paper as accurately as possible without claiming the completeness or perfect precision of the matching.