# zbMATH — the first resource for mathematics

Bayesian classifiers based on kernel density estimation: flexible classifiers. (English) Zbl 1191.68600
Summary: When learning Bayesian network-based classifiers, continuous variables are usually handled by discretization or assumed to follow a Gaussian distribution. This work introduces the kernel-based Bayesian network paradigm for supervised classification. This paradigm is a Bayesian network which estimates the true density of the continuous variables using kernels. In addition, the tree-augmented naive Bayes, the $$k$$-dependence Bayesian classifier and the complete graph classifier are adapted to the novel kernel-based Bayesian network paradigm. Moreover, the strong consistency properties of the presented classifiers are proved and an estimator of the mutual information based on kernels is presented. The classifiers presented in this work can be seen as the natural extension of the flexible naive Bayes classifier proposed by John and Langley [G. H. John and P. Langley, in: Proceedings of the 11th Conference on Uncertainty in Artificial Intelligence, 338–345 (1995)], breaking with its strong independence assumption.
Among the flexible classifiers, the flexible tree-augmented naive Bayes appears to perform best for supervised classification. Moreover, the flexible classifiers presented obtain errors competitive with state-of-the-art classifiers.
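The flexible naive Bayes idea that these classifiers extend can be sketched as follows: each continuous feature's class-conditional density is estimated with a one-dimensional Gaussian kernel density estimate instead of a single Gaussian, and prediction maximizes the log-posterior under the naive independence assumption. This is a minimal illustrative sketch, not the paper's implementation; the Silverman rule-of-thumb bandwidth and all function names here are assumptions chosen for the example.

```python
import math

def gaussian_kernel(u):
    # standard normal kernel
    return math.exp(-0.5 * u * u) / math.sqrt(2 * math.pi)

def kde(x, samples, h):
    # 1-D kernel density estimate at point x with bandwidth h
    return sum(gaussian_kernel((x - s) / h) for s in samples) / (len(samples) * h)

def silverman_bandwidth(samples):
    # rule-of-thumb bandwidth (an assumption; the paper may use another selector)
    n = len(samples)
    mean = sum(samples) / n
    sd = math.sqrt(sum((s - mean) ** 2 for s in samples) / n) or 1e-6
    return 1.06 * sd * n ** (-0.2)

def fit_flexible_nb(X, y):
    # store per-class prior and, per feature, the training samples plus bandwidth
    model = {}
    for c in sorted(set(y)):
        rows = [x for x, label in zip(X, y) if label == c]
        feats = list(zip(*rows))  # transpose: one tuple of values per feature
        model[c] = {
            "prior": len(rows) / len(X),
            "feats": [(list(f), silverman_bandwidth(list(f))) for f in feats],
        }
    return model

def predict(model, x):
    # maximize log prior + sum of log kernel density estimates (naive Bayes)
    best, best_score = None, -math.inf
    for c, params in model.items():
        score = math.log(params["prior"])
        for xi, (samples, h) in zip(x, params["feats"]):
            score += math.log(kde(xi, samples, h) + 1e-300)
        if score > best_score:
            best, best_score = c, score
    return best
```

The kernel-based extensions in the paper (flexible TAN, $$k$$-DB, complete graph) would additionally model dependencies between features, replacing the per-feature product above with conditional kernel density estimates.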

##### MSC:
 68T10 Pattern recognition, speech recognition
 68T05 Learning and adaptive systems in artificial intelligence
##### Software:
C4.5; KernSmooth; UCI-ml