Inference in hidden Markov models.

*(English)* Zbl 1080.62065
Springer Series in Statistics. New York, NY: Springer (ISBN 0-387-40264-0/hbk). xvii, 652 p. (2005).

This monograph attempts to present a reasonably complete and up-to-date picture of the field of hidden Markov models (HMMs) that is self-contained from a theoretical point of view and self-sufficient from a methodological point of view. The scope is not restricted to models with finite state space but also includes models with continuous state space. The special cases of HMMs with finite state space and of Gaussian linear state-space models are treated in separate chapters.

The book starts with an introductory chapter which explains HMMs and gives examples of their use in a variety of fields. This chapter also describes various extensions of HMMs, such as models with autoregression. The main definitions and notations are presented next. The first part, comprising seven chapters, is devoted to state inference; its central topics are filtering, smoothing, and Monte Carlo methods.
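To illustrate the kind of recursion treated in this part, here is a minimal sketch of forward filtering for a finite-state HMM. The toy two-state model, the function name, and all parameter values are the reviewer's illustration, not taken from the book.

```python
import numpy as np

def forward_filter(A, B, pi, obs):
    """Return the filtered probabilities p(x_t | y_1..y_t) for each t.

    A  : transition matrix, A[i, j] = p(x_t = j | x_{t-1} = i)
    B  : emission matrix,   B[i, y] = p(y_t = y | x_t = i)
    pi : initial state distribution
    obs: sequence of observed symbols (integer-coded)
    """
    alpha = pi * B[:, obs[0]]
    alpha /= alpha.sum()                  # normalize to avoid numerical underflow
    filtered = [alpha]
    for y in obs[1:]:
        alpha = (alpha @ A) * B[:, y]     # predict one step, then correct with the new datum
        alpha /= alpha.sum()
        filtered.append(alpha)
    return np.array(filtered)

# Toy two-state example (values chosen for illustration only)
A = np.array([[0.9, 0.1],
              [0.2, 0.8]])
B = np.array([[0.7, 0.3],
              [0.1, 0.9]])
pi = np.array([0.5, 0.5])

post = forward_filter(A, B, pi, [0, 0, 1])   # one probability vector per time step
```

Smoothing, i.e., computing p(x_t | y_1..y_T) for t < T, adds a matching backward pass on top of this forward recursion.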

The second part, comprising four chapters, is on parametric inference. Maximum likelihood inference is the main focus, but one chapter is devoted to fully Bayesian approaches.
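Maximum likelihood inference rests on evaluating the HMM likelihood, which the forward recursion delivers as a product of normalizing constants. A minimal sketch, again on a reviewer-invented toy two-state model, computes the log-likelihood and profiles it over one emission parameter:

```python
import numpy as np

def log_likelihood(A, B, pi, obs):
    """Log-likelihood log p(y_1..y_T) via the scaled forward recursion."""
    alpha = pi * B[:, obs[0]]
    ll = np.log(alpha.sum())              # accumulate log of the normalizing constants
    alpha /= alpha.sum()
    for y in obs[1:]:
        alpha = (alpha @ A) * B[:, y]
        ll += np.log(alpha.sum())
        alpha /= alpha.sum()
    return ll

A = np.array([[0.9, 0.1],
              [0.2, 0.8]])
pi = np.array([0.5, 0.5])
obs = [0, 0, 1]

# Profile the likelihood over p = P(y = 0 | state 0); in practice one would
# maximize over all parameters, e.g. with the EM (Baum-Welch) algorithm.
ps = np.linspace(0.05, 0.95, 19)
lls = [log_likelihood(A, np.array([[p, 1 - p], [0.1, 0.9]]), pi, obs) for p in ps]
best_p = ps[int(np.argmax(lls))]
```

The EM algorithm alternates such likelihood-related forward-backward computations (E-step) with closed-form parameter updates (M-step).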

Background and complements are offered in Part III. Its two chapters provide a primer on Markov chains and an information-theoretic perspective on order estimation. Appendices present basic facts about conditioning and linear prediction.

The book is written for academic researchers in the field of HMMs as well as for practitioners and researchers from other fields. This broad audience is served, on the one hand, by a summary of the results obtained so far together with some new ideas; on the other hand, the reader is led through the computational steps required for making inference in HMMs, and the text supplies the relevant underlying statistical theory. Familiarity with probability and statistical estimation is assumed. The primer on Markov chains in Part III is more a brush-up than a comprehensive treatise of the subject. Besides giving proofs of the theorems, all the theory is illustrated with relevant running examples.

This voluminous book has indeed the potential to become a standard text on HMMs.

Reviewer: R. Schlittgen (Hamburg)