an:00445330
Zbl 0778.62022
Meng, Xiao-Li; Rubin, Donald B.
Maximum likelihood estimation via the ECM algorithm: A general framework
EN
Biometrika 80, No. 2, 267-278 (1993).
00015614
1993
j
62F10 65C99
conditional maximization; constrained optimization; Gibbs sampler; incomplete data; iterated conditional modes; iterative proportional fitting; missing data; loglinear model; contingency tables; regression model; complete-data maximum likelihood estimation; generalized EM algorithms; ECM algorithm; conditional maximum likelihood estimation
Summary: Two major reasons for the popularity of the EM algorithm are that its maximization step involves only complete-data maximum likelihood estimation, which is often computationally simple, and that its convergence is stable, with each iteration increasing the likelihood. When the associated complete-data maximum likelihood estimation is itself complicated, however, EM loses this advantage because the \(M\)-step becomes computationally burdensome. In many cases, complete-data maximum likelihood estimation is relatively simple when conditional on some function of the parameters being estimated.
We introduce a class of generalized EM algorithms, which we call the ECM algorithm, for Expectation/Conditional Maximization (CM), that takes advantage of the simplicity of complete-data conditional maximum likelihood estimation by replacing a complicated \(M\)-step of EM with several computationally simpler CM-steps. We show that the ECM algorithm shares all the appealing convergence properties of EM, such as always increasing the likelihood, and present several illustrative examples.