Levitz, Michael; Perlman, Michael D.; Madigan, David
Separation and completeness properties for AMP chain graph Markov models. (English) Zbl 1043.62080
Ann. Stat. 29, No. 6, 1751-1784 (2001).

Summary: J. Pearl's [Probabilistic reasoning in intelligent systems: networks of plausible inference. (1989; Zbl 0746.68089)] well-known \(d\)-separation criterion for an acyclic directed graph (ADG) is a pathwise separation criterion that can be used to efficiently identify all valid conditional independence relations in the Markov model determined by the graph. This paper introduces \(p\)-separation, a pathwise separation criterion that efficiently identifies all valid conditional independences under the alternative Markov property (AMP) of S. A. Andersson, D. Madigan and M. D. Perlman for chain graphs (= adicyclic graphs) [Scand. J. Stat. 24, 81–102 (1997; Zbl 0918.60050)], which include both ADGs and undirected graphs as special cases. The equivalence of \(p\)-separation to the augmentation criterion occurring in the AMP global Markov property is established, and \(p\)-separation is applied to prove completeness of the global Markov property for AMP chain graph models. Strong completeness of the AMP Markov property is also established, that is, the existence of Markov perfect distributions that satisfy those and only those conditional independences implied by the AMP property (equivalently, by \(p\)-separation). A linear-time algorithm for determining \(p\)-separation is presented; an illustrative sketch of the related \(d\)-separation check appears after the reference list below.

Cited in 15 Documents

MSC:
62M45 Neural nets and related approaches to inference from stochastic processes
68R10 Graph theory (including graph drawing) in computer science
05C90 Applications of graph theory
65C60 Computational problems in statistics (MSC2010)

Keywords: Bayesian networks; completeness; efficient algorithm
Citations: Zbl 0746.68089; Zbl 0918.60050
Software: TETRAD
Full Text: DOI

References:
[1] Andersson, S. A., Madigan, D. and Perlman, M. D. (1996). An alternative Markov property for chain graphs. In Uncertainty in Artificial Intelligence: Proceedings of the Twelfth Conference (F. Jensen and E. Horvitz, eds.) 40-48. Morgan Kaufmann, San Francisco.
    Andersson, S. A., Madigan, D. and Perlman, M. D. (1997a). On the Markov equivalence of chain graphs, undirected graphs and acyclic digraphs. Scand. J. Statist. 24 81-102.
    Andersson, S. A., Madigan, D. and Perlman, M. D. (1997b). A characterization of Markov equivalence classes for acyclic digraphs. Ann. Statist. 25 505-541.
[2] Andersson, S. A., Madigan, D. and Perlman, M. D. (2001). Alternative Markov properties for chain graphs. Scand. J. Statist. 28 33-85. · Zbl 0972.60067 · doi:10.1111/1467-9469.00224
[3] Andersson, S. A. and Perlman, M. D. (1998). Normal linear regression models with recursive graphical Markov structure. J. Multivariate Anal. 66 133-187. · Zbl 1127.62377 · doi:10.1006/jmva.1998.1745
[4] Bouckaert, R. R. and Studený, M. (1995). Chain graphs: semantics and expressiveness. In Symbolic and Quantitative Approaches to Reasoning and Uncertainty. Lecture Notes in AI 946 67-76. Springer, New York.
[5] Buntine, W. L. (1995). Chain graphs for learning. In Uncertainty in Artificial Intelligence: Proceedings of the Eleventh Conference (P. Besnard and S. Hanks, eds.) 87-98. Morgan Kaufmann, San Francisco.
[6] Cowell, R. G., Dawid, A. P., Lauritzen, S. L. and Spiegelhalter, D. J. (1999). Probabilistic Networks and Expert Systems. Springer, New York. · Zbl 0937.68121
[7] Cox, D. R. (1999). Graphical Markov models: statistical motivation. Paper presented at the Workshop on Conditional Independence Structures and Graphical Models, Fields Institute for Research in Mathematical Sciences, Toronto, Canada.
[8] Cox, D. R. and Wermuth, N. (1996). Multivariate Dependencies: Models, Analysis and Interpretation. Chapman and Hall, London.
    Frydenberg, M. (1990a). The chain graph Markov property. Scand. J. Statist. 17 333-353.
    Frydenberg, M. (1990b). Marginalization and collapsibility in graphical interaction models. Ann. Statist. 18 790-805. · Zbl 0880.62124
[9] Geiger, D. and Pearl, J. (1988). On the logic of causal models. In Proceedings of the Fourth Workshop on Uncertainty in AI (R. Shachter, T. Levitt, L. Kanal and J. Lemmer, eds.) 136-147. North-Holland, Amsterdam.
[10] Geiger, D., Verma, T. and Pearl, J. (1990). Identifying independence in Bayesian networks. Networks 20 507-534. · Zbl 0724.05066 · doi:10.1002/net.3230200504
[11] Heckerman, D., Geiger, D. and Chickering, D. M. (1995). Learning Bayesian networks: the combination of knowledge and statistical data. Machine Learning 20 197-243. · Zbl 0831.68096
[12] Højsgaard, S. and Thiesson, B. (1995). BIFROST: Block recursive models induced from relevant knowledge, observations, and statistical techniques. Comput. Statist. Data Anal. 19 155-175. · Zbl 0875.62566 · doi:10.1016/0167-9473(93)E0054-8
[13] Koster, J. T. A. (1999). Linear Structural Equations and Graphical Models. The Fields Institute, Toronto, Canada.
[14] Lauritzen, S. L. (1996). Graphical Models. Oxford Univ. Press. · Zbl 0907.62001
[15] Lauritzen, S. L., Dawid, A. P., Larsen, B. N. and Leimer, H. G. (1990). Independence properties of directed Markov fields. Networks 20 491-505. · Zbl 0743.05065 · doi:10.1002/net.3230200503
[16] Lauritzen, S. L. and Wermuth, N. (1989). Graphical models for association between variables, some of which are qualitative and some quantitative. Ann. Statist. 17 31-57. · Zbl 0669.62045 · doi:10.1214/aos/1176347003
[17] Madigan, D., Andersson, S. A., Perlman, M. D. and Volinsky, C. M. (1996). Bayesian model averaging and model selection for Markov equivalence classes of acyclic digraphs. Comm. Statist. Theory Methods Ser. A 25 2493-2519. · Zbl 0894.62032 · doi:10.1080/03610929608831853
[18] Meek, C. (1995). Strong completeness and faithfulness in Bayesian networks. In Uncertainty in Artificial Intelligence: Proceedings of the Eleventh Conference (P. Besnard and S. Hanks, eds.) 403-410. Morgan Kaufmann, San Mateo, CA.
[19] Okamoto, M. (1973). Distinctness of the eigenvalues of a quadratic form in a multivariate sample. Ann. Statist. 1 763-765. · Zbl 0261.62043 · doi:10.1214/aos/1176342472
[20] Pearl, J. (1988). Probabilistic Reasoning in Intelligent Systems. Morgan Kaufmann, San Mateo, CA. · Zbl 0746.68089
[21] Spiegelhalter, D. J., Dawid, A. P., Lauritzen, S. L. and Cowell, R. G. (1993). Bayesian analysis in expert systems (with discussion). Statist. Sci. 8 219-283. · Zbl 0955.62523 · doi:10.1214/ss/1177010888
[22] Spirtes, P., Glymour, C. and Scheines, R. (1993). Causation, Prediction, and Search. Lecture Notes in Statist. 81. Springer, New York. · Zbl 0806.62001
[23] Studený, M. (1996). On separation criterion and recovery algorithm for chain graphs. In Uncertainty in Artificial Intelligence: Proceedings of the Twelfth Conference (F. Jensen and E. Horvitz, eds.) 509-516. Morgan Kaufmann, San Francisco.
[24] Studený, M. (1997). A recovery algorithm for chain graphs. Internat. J. Approx. Reason. 17 265-293. · Zbl 0939.68096 · doi:10.1016/S0888-613X(97)00018-2
[25] Studený, M. (1998). Bayesian networks from the point of view of chain graphs. In Uncertainty in Artificial Intelligence: Proceedings of the Fourteenth Conference (G. F. Cooper and S. Moral, eds.) 496-503. Morgan Kaufmann, San Francisco.
[26] Studený, M. and Bouckaert, R. R. (1998). On chain graph models for description of conditional independence structure. Ann. Statist. 26 1434-1495. · Zbl 0930.62066 · doi:10.1214/aos/1024691250
[27] Wermuth, N. and Lauritzen, S. L. (1990). On substantive research hypotheses, conditional independence graphs, and graphical chain models (with discussion). J. Roy. Statist. Soc. Ser. B 52 21-72. · Zbl 0749.62004
[28] Whittaker, J. (1990). Graphical Models in Applied Multivariate Statistics. Wiley, New York. · Zbl 0732.62056
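The summary mentions, but does not reproduce, the paper's linear-time \(p\)-separation algorithm. As a purely illustrative sketch of the simpler criterion that \(p\)-separation generalizes, the following Python snippet checks Pearl's \(d\)-separation in an ADG using the moralization criterion of Lauritzen, Dawid, Larsen and Leimer [15]: restrict to the ancestral set, moralize, then test ordinary graph separation. This is not the paper's \(p\)-separation procedure for chain graphs; the representation (a dict mapping each vertex to its set of parents) and the names ancestors and is_d_separated are hypothetical choices for this sketch.

# Illustrative sketch only: d-separation in an acyclic directed graph via the
# moralization criterion (Lauritzen, Dawid, Larsen and Leimer 1990, ref. [15]).
# A DAG is encoded as a dict mapping each vertex to the set of its parents.
from collections import deque

def ancestors(dag, nodes):
    """Return `nodes` together with all of their ancestors in `dag`."""
    seen = set()
    stack = list(nodes)
    while stack:
        v = stack.pop()
        if v not in seen:
            seen.add(v)
            stack.extend(dag.get(v, ()))
    return seen

def is_d_separated(dag, xs, ys, zs):
    """True iff every vertex in xs is d-separated from every vertex in ys given zs."""
    xs, ys, zs = set(xs), set(ys), set(zs)
    # Step 1: restrict to the ancestral set of xs, ys and zs.
    anc = ancestors(dag, xs | ys | zs)
    # Step 2: moralize the induced subgraph (join the parents of each
    # vertex, then drop edge directions).
    moral = {v: set() for v in anc}
    for child in anc:
        parents = [p for p in dag.get(child, ()) if p in anc]
        for p in parents:
            moral[child].add(p)
            moral[p].add(child)
        for i, p in enumerate(parents):
            for q in parents[i + 1:]:
                moral[p].add(q)
                moral[q].add(p)
    # Step 3: xs and ys are d-separated by zs iff zs blocks every path
    # between them in the moral graph (breadth-first search avoiding zs).
    frontier = deque(xs - zs)
    reached = set(xs - zs)
    while frontier:
        v = frontier.popleft()
        if v in ys:
            return False
        for w in moral[v] - zs:
            if w not in reached:
                reached.add(w)
                frontier.append(w)
    return True

# Collider example a -> c <- b: a and b are d-separated marginally,
# but conditioning on the common child c opens the path.
dag = {"a": set(), "b": set(), "c": {"a", "b"}}
assert is_d_separated(dag, {"a"}, {"b"}, set())
assert not is_d_separated(dag, {"a"}, {"b"}, {"c"})

The search in step 3 never crosses a conditioning vertex, so the test reduces conditional independence to plain reachability in the moral graph; the AMP augmentation criterion and the paper's \(p\)-separation play the analogous role for chain graphs.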