# zbMATH — the first resource for mathematics

A conditional independence algorithm for learning undirected graphical models. (English) Zbl 1186.68356
Summary: When it comes to learning graphical models from data, approaches based on conditional independence tests are among the most popular methods. Since Bayesian networks dominate research in this field, these methods usually refer to directed graphs and thus have to determine not only the set of edges but also their directions. At least for a certain kind of possibilistic graphical model, however, undirected graphs are a much more natural basis. Hence, in this area, algorithms for learning undirected graphs are desirable, especially since first learning a directed graph and then transforming it into an undirected one wastes resources and computation time. In this paper, I present a general algorithm for learning undirected graphical models that is strongly inspired by the well-known Cheng-Bell-Liu algorithm for learning Bayesian networks from data. Its main advantage is that it needs fewer conditional independence tests while achieving results of comparable quality.
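The constraint-based scheme the summary describes can be illustrated with a minimal sketch: start from the complete undirected graph and delete an edge whenever some small conditioning set renders its endpoints (approximately) conditionally independent. This is not the paper's algorithm; it is a generic skeleton-learning sketch, and the use of a conditional mutual information estimate with a fixed threshold as the independence test, as well as all names and parameters, are illustrative assumptions.

```python
import itertools
import math

def cond_mutual_info(data, x, y, z):
    """Estimate conditional mutual information I(X;Y|Z) in nats from
    discrete samples. data: list of tuples; x, y: variable indices;
    z: tuple of conditioning-variable indices (may be empty)."""
    n = len(data)
    c_xyz, c_xz, c_yz, c_z = {}, {}, {}, {}
    for row in data:
        zv = tuple(row[i] for i in z)
        c_xyz[(row[x], row[y], zv)] = c_xyz.get((row[x], row[y], zv), 0) + 1
        c_xz[(row[x], zv)] = c_xz.get((row[x], zv), 0) + 1
        c_yz[(row[y], zv)] = c_yz.get((row[y], zv), 0) + 1
        c_z[zv] = c_z.get(zv, 0) + 1
    # I(X;Y|Z) = sum p(x,y,z) * log( p(x,y,z) p(z) / (p(x,z) p(y,z)) )
    cmi = 0.0
    for (xv, yv, zv), c in c_xyz.items():
        cmi += (c / n) * math.log((c * c_z[zv]) / (c_xz[(xv, zv)] * c_yz[(yv, zv)]))
    return cmi

def learn_undirected_skeleton(data, n_vars, threshold=0.01, max_cond=1):
    """Start from the complete graph; remove an edge (x, y) as soon as some
    conditioning set of size <= max_cond makes x and y look independent."""
    edges = set(itertools.combinations(range(n_vars), 2))
    for size in range(max_cond + 1):
        for (x, y) in sorted(edges):
            others = [v for v in range(n_vars) if v not in (x, y)]
            for zset in itertools.combinations(others, size):
                if cond_mutual_info(data, x, y, zset) < threshold:
                    edges.discard((x, y))  # independence found: drop the edge
                    break
    return edges
```

On samples from a chain X - Y - Z, the test on the pair (X, Z) given {Y} fires and only the two chain edges survive; the point of the paper is that a carefully ordered variant of this loop gets by with far fewer such tests.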
##### MSC:
68T05 Learning and adaptive systems
68R10 Graph theory in connection with computer science (including graph drawing)
68W05 Nonnumerical algorithms
##### References:
[1] Agosta, J. M.: Intel Technol. J. 8, No. 4, 361-372 (2004)
[2] Borgelt, C.; Kruse, R.: Evaluation measures for learning probabilistic and possibilistic networks, 1034-1038 (1997)
[3] Borgelt, C.; Kruse, R.: Efficient maximum projection of database-induced multivariate possibility distributions (1998)
[4] Borgelt, C.; Kruse, R.: Graphical models — methods for data analysis and mining (2002) · Zbl 1017.62002
[5] Castillo, G.: Adaptive learning algorithms for Bayesian network classifiers, AI Commun. 21, No. 1, 87-88 (2008)
[6] Charitos, T.; van der Gaag, L. C.; Visscher, S.; Schurink, K. A. M.; Lucas, P. J. F.: A dynamic Bayesian network for diagnosing ventilator-associated pneumonia in ICU patients, Expert Systems Appl. 36, No. 2.1, 1249-1258 (2007)
[7] Chickering, D. M.: Optimal structure identification with greedy search, J. Mach. Learn. Res. 3, 507-554 (2002) · Zbl 1084.68519 · doi:10.1162/153244303321897717
[8] Cho, S.-J.; Kim, J. H.: Bayesian network modeling of strokes and their relationship for on-line handwriting recognition, Pattern Recogn. 37, No. 2, 253-264 (2003) · Zbl 1059.68103 · doi:10.1016/j.patcog.2003.01.001
[9] Cooper, G. F.; Herskovits, E.: A Bayesian method for the induction of probabilistic networks from data, Mach. Learn. 9, 309-347 (1992) · Zbl 0766.68109
[10] Cheng, J.; Bell, D. A.; Liu, W.: Learning belief networks from data: an information theory based approach, 325-331 (1997)
[11] Cheng, J.; Greiner, R.; Kelly, J.; Bell, D. A.; Liu, W.: Learning Bayesian networks from data: an information theory based approach, Artificial Intelligence 137, No. 1-2, 43-90 (2002) · Zbl 0995.68114 · doi:10.1016/S0004-3702(02)00191-1
[12] Buntine, W.: Operations for learning with graphical models, J. Artificial Intelligence Res. 2, 159-225 (1994)
[13] De Campos, L. M.; Huete, J. F.; Moral, S.: Independence in uncertainty theories and its application to learning belief networks, DRUMS Handbook on Abduction and Learning, 391-434 (2000) · Zbl 0970.68133
[14] Castillo, E.; Gutierrez, J. M.; Hadi, A. S.: Expert systems and probabilistic network models (1997)
[15] Chow, C. K.; Liu, C. N.: Approximating discrete probability distributions with dependence trees, IEEE Trans. Inform. Theory 14, No. 3, 462-467 (1968) · Zbl 0165.22305 · doi:10.1109/TIT.1968.1054142
[16] Frankel, J.; Webster, M.; King, S.: Articulatory feature recognition using dynamic Bayesian networks, Comput. Speech and Lang. 21, No. 4, 620-640 (2007)
[17] Gamez, J. A.; Moral, S.; Salmeron, A.: Advances in Bayesian networks (2004)
[18] Gebhardt, J.: Learning from data: possibilistic graphical models, Habilitation Thesis, University of Braunschweig, Germany (1997)
[19] , , 46 (2004)
[20] Heckerman, D.; Geiger, D.; Chickering, D. M.: Learning Bayesian networks: the combination of knowledge and statistical data, Mach. Learn. 20, 197-243 (1995) · Zbl 0831.68096
[21] Jensen, F. V.: Bayesian networks and decision graphs (2001)
[22] Learning in graphical models (1998)
[23] Khanafar, R. M.; Solana, B.; Triola, J.; Barco, R.; Moltsen, L.; Altman, Z.; Lazaro, P.: Automated diagnosis for UMTS networks using a Bayesian network approach, IEEE Trans. Veh. Technol. 57, No. 4, 2451-2461 (2008)
[24] Kim, S.; Imoto, S.; Miyano, S.: Dynamic Bayesian network and nonparametric regression for nonlinear modeling of gene networks from time series gene expression data, Biosystems 75, No. 1-3, 57-65 (2004)
[25] Lauritzen, S. L.: Graphical models (1996)
[26] Neapolitan, R. E.: Learning Bayesian networks (2004)
[27] Niculescu, R. S.; Mitchell, T. M.; Rao, R. B.: Bayesian network learning with parameter constraints, J. Mach. Learn. Res. 7, 1357-1383 (2006) · Zbl 1222.68275 · http://www.jmlr.org/papers/v7/niculescu06a.html
[28] Pearl, J.: Probabilistic reasoning in intelligent systems: networks of plausible inference (1988)
[29] Pernkopf, F.: Detection of surface defects on raw steel blocks using Bayesian network classifiers, Pattern Anal. Appl. 7, No. 3, 333-342 (2004)
[30] Rasmussen, L. K.: Blood group determination of Danish Jersey cattle in the F-blood group system, Dina Res. Rep. 8 (1992)
[31] Robles, V.; Larrañaga, P.; Pena, J.; Menasalvas, E.; Perez, M.; Herves, V.; Wasilewska, A.: Bayesian network multi-classifiers for protein secondary structure identification, Artificial Intelligence Med. 31, No. 2, 117-136 (2004)
[32] Roos, T.; Wettig, H.; Grünwald, P.; Myllymäki, P.; Tirri, H.: On discriminative Bayesian network classifiers and logistic regression, Mach. Learn. 65, No. 1, 31-78 (2005)
[33] Schneiderman, H.: Learning a restricted Bayesian network for object recognition, 639-646 (2004)
[34] Singh, M.; Valtorta, M.: An algorithm for the construction of Bayesian network structures from data, 259-265 (1993)
[35] Spirtes, P.; Glymour, C.; Scheines, R.: Causation, prediction, and search, Lecture Notes in Statist. 81 (1993) · Zbl 0806.62001
[36] Steck, H.: Constraint-based structural learning in Bayesian networks using finite data sets, PhD Thesis, TU München, Germany (2001)
[37] Tsamardinos, I.; Brown, L. E.; Aliferis, C. F.: The max-min hill-climbing Bayesian network structure learning algorithm, Mach. Learn. 65, No. 1, 31-78 (2006)
[38] Whittaker, J.: Graphical models in applied multivariate statistics (1990)