Bayesian representation of stochastic processes under learning: de Finetti revisited. (English) Zbl 1056.62509
Summary: A probability distribution governing the evolution of a stochastic process has infinitely many Bayesian representations of the form \(\mu = \int_\Theta \mu_\theta \, d\lambda(\theta)\). Among these, a natural representation is one whose components (the \(\mu_\theta\)’s) are ‘learnable’ (one can approximate \(\mu_\theta\) by conditioning \(\mu\) on observation of the process) and ‘sufficient for prediction’ (\(\mu_\theta\)’s predictions are not aided by conditioning on observation of the process).
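[For orientation, one natural way to read these two requirements (an informal gloss; the notation \(h_n\) for the first \(n\) observed outcomes is assumed here, and the paper’s exact definitions may differ in detail): learnability asks that the predictions of \(\mu\) merge with those of \(\mu_\theta\) for \(\lambda\)-almost every \(\theta\),
\[
\bigl\lVert \mu(\,\cdot \mid h_n) - \mu_\theta(\,\cdot \mid h_n) \bigr\rVert \;\longrightarrow\; 0 \qquad \mu_\theta\text{-a.s.},
\]
while sufficiency for prediction asks that, once \(\theta\) is fixed, conditioning \(\mu_\theta\) on \(h_n\) does not (asymptotically) change its forecasts of future events.]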
We show the existence and uniqueness of such a representation under a suitable asymptotic mixing condition on the process. This representation can be obtained by conditioning on the tail-field of the process, and any learnable representation that is sufficient for prediction is asymptotically like the tail-field representation. This result is related to the celebrated de Finetti theorem, but with exchangeability weakened to an asymptotic mixing condition, and with his conclusion of a decomposition into i.i.d. component distributions weakened to components that are learnable and sufficient for prediction.
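[For comparison, the classical statement being generalized (a standard fact, included here only for context): de Finetti’s theorem says that an exchangeable \(\{0,1\}\)-valued process \(X_1, X_2, \dots\) admits a unique mixture representation over i.i.d. coin-toss distributions,
\[
\mu(X_1 = x_1, \dots, X_n = x_n) \;=\; \int_0^1 \theta^{k} (1-\theta)^{\,n-k} \, d\lambda(\theta), \qquad k = \sum_{i=1}^{n} x_i .
\]
Each component \(\mu_\theta\), i.i.d. Bernoulli(\(\theta\)), is learnable (the empirical frequency \(k/n\) converges to \(\theta\), \(\mu_\theta\)-a.s.) and sufficient for prediction (under \(\mu_\theta\) the past is independent of the future, so conditioning cannot improve forecasts).]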

MSC:
62F15 Bayesian inference
60G09 Exchangeability for stochastic processes
62M99 Inference from stochastic processes
62M45 Neural nets and related approaches to inference from stochastic processes