zbMATH — the first resource for mathematics

Asymptotically exact inference in differentiable generative models. (English) Zbl 1380.65025
Summary: Many generative models can be expressed as a differentiable function applied to input variables sampled from a known probability distribution. This framework includes both the generative component of learned parametric models such as variational autoencoders and generative adversarial networks, and also procedurally defined simulator models which involve only differentiable operations. Though the distribution on the input variables to such models is known, often the distribution on the output variables is only implicitly defined. We present a method for performing efficient Markov chain Monte Carlo inference in such models when conditioning on observations of the model output. For some models this offers an asymptotically exact inference method where approximate Bayesian computation might otherwise be employed. We use the intuition that computing conditional expectations is equivalent to integrating over a density defined on the manifold corresponding to the set of inputs consistent with the observed outputs. This motivates the use of a constrained variant of Hamiltonian Monte Carlo which leverages the smooth geometry of the manifold to move between inputs exactly consistent with observations. We validate the method by performing inference experiments in a diverse set of models.
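The core idea — moving between inputs exactly consistent with the observed outputs by exploiting the smooth geometry of the constraint manifold — can be illustrated with a minimal sketch. This is not the paper's full algorithm (it omits the potential-energy term, the Metropolis accept step, and the reversibility checks of a proper constrained HMC implementation); it only shows the RATTLE-style mechanics on a toy generator `f(u) = u₀² + u₁²` with a scalar observation `y_obs`, all names being hypothetical:

```python
import numpy as np

def f(u):
    # toy differentiable generator: latent u in R^2 -> scalar output
    return u[0]**2 + u[1]**2

def jac(u):
    # Jacobian of f, shape (1, 2)
    return np.array([[2.0 * u[0], 2.0 * u[1]]])

y_obs = 1.0  # hypothetical observation; manifold is the unit circle

def c(u):
    # constraint: inputs exactly consistent with the observation
    return np.array([f(u) - y_obs])

def project_position(u, J, tol=1e-10, max_iter=50):
    # quasi-Newton projection back onto c(u) = 0, reusing the
    # Jacobian J from the step's start point (RATTLE-style)
    for _ in range(max_iter):
        cu = c(u)
        if np.max(np.abs(cu)) < tol:
            return u, True
        lam = np.linalg.solve(J @ J.T, cu)
        u = u - J.T @ lam
    return u, False

def project_momentum(p, u):
    # project momentum onto the tangent space of the manifold at u
    J = jac(u)
    return p - J.T @ np.linalg.solve(J @ J.T, J @ p)

def constrained_leapfrog(u, p, step):
    # one position update constrained to the manifold: free move,
    # Newton projection back, then momentum reset to the new tangent space
    J0 = jac(u)
    p = project_momentum(p, u)
    u_new, ok = project_position(u + step * p, J0)
    p_new = project_momentum((u_new - u) / step, u_new)
    return u_new, p_new, ok

rng = np.random.default_rng(0)
u = np.array([1.0, 0.0])          # a point with f(u) = y_obs exactly
p = rng.standard_normal(2)
for _ in range(20):
    u, p, ok = constrained_leapfrog(u, p, 0.05)
    assert ok                      # projection converged at this step size
print(abs(f(u) - y_obs))           # remains numerically zero along the trajectory
```

Because each position update ends with a Newton projection onto `c(u) = 0`, every point the sampler visits is exactly consistent with the observation, which is the property that makes the method asymptotically exact where ABC would introduce a tolerance.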

65C60 Computational problems in statistics (MSC2010)
62F15 Bayesian inference
65C05 Monte Carlo methods
65C40 Numerical analysis or methods applied to Markov chains
60J22 Computational methods in Markov chains
Full Text: DOI Euclid