Kochman, Fred; Reeds, Jim
A simple proof of Kaijser's unique ergodicity result for hidden Markov \(\alpha\)-chains. (English) Zbl 1121.60077
Ann. Appl. Probab. 16, No. 4, 1805-1815 (2006).

Summary: According to a result of T. Kaijser [Ann. Probab. 3, 677–696 (1975; Zbl 0315.60038)], if some nonvanishing product of hidden Markov model (HMM) stepping matrices is subrectangular, and the underlying chain is aperiodic, then the corresponding \(\alpha\)-chain has a unique invariant limiting measure \(\lambda\). Here the \(\alpha\)-chain \(\{\alpha_n\}=\{(\alpha_{ni})\}\) is given by \(\alpha_{ni}=p(X_n=i\mid Y_n,Y_{n-1},\dots)\), where \(\{(X_n,Y_n)\}\) is a finite-state HMM with unobserved Markov chain component \(\{X_n\}\) and observed output component \(\{Y_n\}\). This defines \(\{\alpha_n\}\) as a stochastic process taking values in the probability simplex, and it is not hard to see that \(\{\alpha_n\}\) is itself a Markov chain. The stepping matrices \(M(y)=(M(y)_{ij})\) give the probability that \((X_n,Y_n)=(j,y)\), conditional on \(X_{n-1}=i\). A matrix is said to be subrectangular if the locations of its nonzero entries form a Cartesian product of a set of row indices and a set of column indices. Kaijser's result is based on an application of the Furstenberg-Kesten theory to the random matrix products \(M(Y_1)M(Y_2)\dots M(Y_n)\). We prove a slightly stronger form of Kaijser's theorem with a simpler argument, exploiting the theory of e-chains.

Cited in 1 Review; cited in 7 Documents.

MSC:
60J10 Markov chains (discrete-time Markov processes on discrete state spaces)
60J05 Discrete-time Markov processes on general state spaces
60F99 Limit theorems in probability theory

Keywords: hidden Markov models; uniform mean stability; e-chain; ergodicity

Citations: Zbl 0315.60038

Full Text: DOI arXiv

References:
[1] Barnsley, M. F., Demko, S. G., Elton, J. H. and Geronimo, J. S. (1988). Invariant measures for Markov processes arising from iterated function systems with place-dependent probabilities. Ann. Inst. H. Poincaré Probab. Statist. 24 367–394. · Zbl 0653.60057
[2] Baum, L. E. (1972). An inequality and associated maximization technique in statistical estimation for probabilistic functions of Markov processes. In Inequalities III (O. Shisha, ed.) 1–8. Academic Press, New York.
[3] Blackwell, D. (1957). The entropy of functions of finite-state Markov chains. In Trans. of the First Prague Conference on Information Theory, Statistical Decision Functions, and Random Processes 13–20. Publishing House of the Czechoslovak Academy of Sciences, Prague. · Zbl 0085.12401
[4] Doeblin, W. and Fortet, R. (1937). Sur des chaînes à liaisons complètes. Bull. Soc. Math. France 65 132–148. · Zbl 0018.03303
[5] Elliott, R., Aggoun, L. and Moore, J. B. (1995). Hidden Markov Models, Estimation and Control. Springer, New York. · Zbl 0819.60045
[6] Furstenberg, H. and Kesten, H. (1960). Products of random matrices. Ann. Math. Statist. 31 457–469. · Zbl 0137.35501 · doi:10.1214/aoms/1177705909
[7] Horn, R. A. and Johnson, C. R. (1990). Matrix Analysis. Cambridge Univ. Press. · Zbl 0704.15002
[8] Iosifescu, M. and Theodorescu, R. (1969). Random Processes and Learning. Springer, New York. · Zbl 0194.51101
[9] Kaijser, T. (1975). A limit theorem for partially observed Markov chains. Ann. Probab. 3 677–696. · Zbl 0315.60038 · doi:10.1214/aop/1176996308
[10] MacDonald, I. L. and Zucchini, W. (1997). Hidden Markov and Other Models for Discrete-Valued Time Series. Chapman and Hall, London. · Zbl 0868.60036
[11] Meyn, S. P. and Tweedie, R. L. (1993). Markov Chains and Stochastic Stability. Springer, London. · Zbl 0925.60001
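The \(\alpha\)-chain transition described in the summary is the standard HMM filter recursion: given the next observation \(Y_n=y\), the new state \(\alpha_n\) is proportional to the row vector \(\alpha_{n-1}M(y)\), renormalized to lie in the probability simplex. A minimal sketch of this step, together with the subrectangularity test used in Kaijser's hypothesis, is given below; the 2-state, 2-symbol stepping matrices are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Hypothetical stepping matrices for a 2-state, 2-symbol HMM:
# M[y][i, j] = P(X_n = j, Y_n = y | X_{n-1} = i),
# so M[0] + M[1] is the transition matrix of the hidden chain {X_n}.
M = {
    0: np.array([[0.30, 0.10],
                 [0.05, 0.25]]),
    1: np.array([[0.20, 0.40],
                 [0.45, 0.25]]),
}

def alpha_step(alpha, y):
    """One step of the alpha-chain: alpha_n is proportional to the
    row vector alpha_{n-1} M(y), renormalized to sum to 1."""
    v = alpha @ M[y]
    return v / v.sum()

def is_subrectangular(A):
    """True iff the nonzero entries of A occupy a full Cartesian
    product: (rows containing a nonzero) x (columns containing one)."""
    nz = A != 0
    rows = nz.any(axis=1)   # row indices with some nonzero entry
    cols = nz.any(axis=0)   # column indices with some nonzero entry
    return np.array_equal(nz, np.outer(rows, cols))
```

Since every entry of these illustrative matrices is positive, any product `M[y1] @ M[y2] @ ...` is strictly positive and hence subrectangular, so Kaijser's hypothesis would hold trivially here; the identity matrix, by contrast, is not subrectangular, since its nonzero pattern is not a full Cartesian product.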