Quantitative approximation of the probability distribution of a Markov process by formal abstractions. (English) Zbl 1342.60120
Summary: The goal of this work is to formally abstract a Markov process evolving in discrete time over a general state space as a finite-state Markov chain, with the objective of precisely approximating its state probability distribution over time; the distribution can then be computed approximately, and faster, through that of the Markov chain. The approach is based on formal abstractions: it employs an arbitrary finite partition of the state space of the Markov process and computes average transition probabilities between partition sets. The abstraction technique is formal in that it comes with guarantees on the introduced approximation error that depend on the diameters of the partition sets: as such, the error can be tuned at will. Further, for Markov processes with unbounded state spaces, a procedure for precisely truncating the state space to a compact set is provided, together with an error bound that depends on the asymptotic properties of the transition kernel of the original process. The overall abstraction algorithm, which practically hinges on piecewise constant approximations of the density functions of the Markov process, is extended to higher-order function approximations: these can lead to improved error bounds and lower computational requirements. The approach is tested on the computation of the probabilistic invariance of the Markov process under study, and is compared to a known alternative approach from the literature.
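As a rough illustration of the partition-based abstraction described in the summary (a minimal sketch, not the authors' implementation): the code below builds a finite-state Markov chain from a one-dimensional process with an assumed Gaussian transition kernel, using a uniform partition of a truncated domain, one representative point per partition set, and a piecewise constant approximation of the transition density. The kernel, the truncation interval, the partition, and all function names are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm


def abstract_markov_process(lo=-5.0, hi=5.0, n_bins=100, a=0.8, sigma=1.0):
    """Abstract an assumed 1-D process x_{k+1} ~ N(a*x_k, sigma^2) into a
    finite-state Markov chain over a uniform partition of [lo, hi]."""
    edges = np.linspace(lo, hi, n_bins + 1)
    reps = 0.5 * (edges[:-1] + edges[1:])   # representative point of each partition set

    # P[i, j] approximates the probability of moving from set i to set j:
    # the kernel mass over set j, evaluated at the representative point of
    # set i (piecewise constant approximation of the transition density).
    P = np.empty((n_bins, n_bins))
    for i, z in enumerate(reps):
        cdf = norm.cdf(edges, loc=a * z, scale=sigma)
        P[i, :] = np.diff(cdf)
    P /= P.sum(axis=1, keepdims=True)       # renormalise mass lost to truncation
    return reps, P


def propagate(p0, P, steps):
    """Approximate the process distribution at a given time step by
    propagating the initial distribution through the abstract chain."""
    p = p0.copy()
    for _ in range(steps):
        p = p @ P
    return p


if __name__ == "__main__":
    reps, P = abstract_markov_process()
    p0 = np.zeros(len(reps))
    p0[len(reps) // 2] = 1.0                # mass concentrated near x = 0
    print(propagate(p0, P, steps=10)[:5])   # first few entries of the approximate distribution
```

Refining the partition (increasing `n_bins`, i.e. shrinking the diameters of the partition sets) tightens the approximation, at the cost of a larger transition matrix, which mirrors the tunable error guarantees discussed in the summary.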

MSC:
60J05 Discrete-time Markov processes on general state spaces
60J10 Markov chains (discrete-time Markov processes on discrete state spaces)
60J22 Computational methods in Markov chains
60F05 Central limit and other weak theorems
65C40 Numerical analysis or methods applied to Markov chains
Software:
PRISM; MRMC; MPT