Probability density decomposition for conditionally dependent random variables modeled by vines. (English) Zbl 1314.62040

Summary: A vine is a new graphical model for dependent random variables. Vines generalize the Markov trees often used in modeling multivariate distributions. They differ from Markov trees and Bayesian belief nets in that the concept of conditional independence is weakened to allow for various forms of conditional dependence. A general formula for the density of a vine-dependent distribution is derived. This generalizes the well-known density formula for belief nets, which is based on the decomposition of belief nets into cliques. Furthermore, the formula allows a simple proof of the Information Decomposition Theorem for a regular vine. The problem of (conditional) sampling is discussed, and Gibbs sampling is proposed for sampling from conditional vine-dependent distributions. The so-called ‘canonical vines’, built on highest-degree trees, offer the most efficient structure for Gibbs sampling.
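In the notation of the later pair-copula literature (the edge labels \(j_e, k_e\) and conditioning sets \(D_e\) below follow that convention and are not necessarily the paper's own symbols), the density formula referred to in the summary can be sketched for a regular vine \(\mathcal{V}\) as:

```latex
% Sketch of the regular-vine density factorization. Each edge e of
% the vine V pairs a conditioned set {j_e, k_e} with a conditioning
% set D_e; c_{j_e,k_e|D_e} denotes the density of the bivariate
% copula attached to that edge, and F(. | x_{D_e}) the conditional
% distribution functions of the indicated margins.
f(x_1,\dots,x_n)
  = \prod_{i=1}^{n} f_i(x_i)
    \prod_{e \in E(\mathcal{V})}
      c_{j_e,k_e \mid D_e}\!\left(
        F(x_{j_e} \mid \mathbf{x}_{D_e}),\;
        F(x_{k_e} \mid \mathbf{x}_{D_e})
      \right)
```

When the copulas attached to all trees above the first are the independence copula, the second product collapses to the Markov-tree factorization, which is the sense in which the formula weakens conditional independence to conditional dependence.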


62E10 Characterization and structure theory of statistical distributions
62B10 Statistical aspects of information-theoretic topics
62H20 Measures of association (correlation, canonical correlation, etc.)
68T30 Knowledge representation
94A17 Measures of information, entropy
Full Text: DOI