
Information-extreme method for classification of observations with categorical attributes. (English. Russian original) Zbl 1346.68170

Cybern. Syst. Anal. 52, No. 2, 224-231 (2016); translation from Kibern. Sist. Anal. 2016, No. 2, 56-63 (2016).
Summary: An algorithm of information-extreme machine learning is proposed. It is based on adaptive coding of the multitype primary features used in recognition and on optimization of the geometric parameters of the partition of the space of secondary (unified) features into equivalence classes, carried out by iteratively driving the global maximum of an information criterion toward its boundary value.
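
The following is a minimal Python sketch of the general scheme described in the summary, not the authors' implementation: categorical and numeric primary features are coded into binary (unified) features, and for each class a hyperspherical container in Hamming space is grown around a reference vector, with the radius chosen to maximize an information criterion computed from first- and second-kind error estimates. The function names (binarize, info_criterion, train_container), the one-hot/threshold coding, and the particular Kullback-type criterion are illustrative assumptions rather than the paper's notation.

    # Hedged sketch of an information-extreme learning scheme (illustrative only).
    import numpy as np

    def binarize(X, categories, thresholds):
        """Map mixed-type rows to binary (unified) feature vectors.

        categories: per column, the list of known category values (or None for numeric)
        thresholds: per column, a threshold for numeric features (or None) -- a fixed
        stand-in for the adaptive coding of primary features mentioned in the summary.
        """
        cols = []
        for j in range(X.shape[1]):
            if categories[j] is not None:            # categorical -> one-hot bits
                for v in categories[j]:
                    cols.append((X[:, j] == v).astype(int))
            else:                                    # numeric -> single threshold bit
                cols.append((X[:, j].astype(float) > thresholds[j]).astype(int))
        return np.column_stack(cols)

    def info_criterion(alpha, beta, eps=1e-6):
        """Kullback-type criterion of the error rates alpha (1st kind), beta (2nd kind)."""
        d1, d2 = 1.0 - alpha, 1.0 - beta             # correct-decision rates
        return 0.5 * (d1 + d2 - alpha - beta) * np.log2((d1 + d2 + eps) / (alpha + beta + eps))

    def train_container(B_own, B_other):
        """Choose the Hamming radius around the class reference vector that
        maximizes the criterion; returns (reference_vector, best_radius)."""
        ref = (B_own.mean(axis=0) >= 0.5).astype(int)   # majority-vote reference vector
        d_own = np.abs(B_own - ref).sum(axis=1)         # Hamming distances, own class
        d_other = np.abs(B_other - ref).sum(axis=1)     # Hamming distances, other classes
        best_r, best_E = 0, -np.inf
        for r in range(1, B_own.shape[1]):
            alpha = np.mean(d_own > r)                  # own samples outside the container
            beta = np.mean(d_other <= r)                # foreign samples inside the container
            E = info_criterion(alpha, beta)
            if E > best_E:
                best_E, best_r = E, r
        return ref, best_r

In the adaptive-coding scheme of the summary, the coding parameters themselves would also be tuned during learning together with the container geometry; the sketch keeps them fixed for brevity.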

MSC:

68T10 Pattern recognition, speech recognition
68T05 Learning and adaptive systems in artificial intelligence
