An improved stochastic EM algorithm for large-scale full-information item factor analysis. (English) Zbl 1440.62089
Summary: In this paper, we explore the use of the stochastic EM algorithm [G. Celeux and J. Diebolt, Comput. Stat. Q. 2, 73–82 (1985), https://ci.nii.ac.jp/naid/10010480916/en/] for large-scale full-information item factor analysis. Innovations have been made in its implementation, including an adaptive-rejection-based Gibbs sampler for the stochastic E step, a proximal gradient descent algorithm for the M-step optimization, and diagnostic procedures for determining the burn-in size and when to stop the algorithm. These developments are based on the theoretical results of S. F. Nielsen [Bernoulli 6, No. 3, 457–489 (2000; Zbl 0981.62022)], as well as advanced sampling and optimization techniques. The proposed algorithm is computationally efficient and virtually tuning-free, making it scalable to large-scale data with many latent traits (e.g., more than five) and easy to use for practitioners. Standard errors of the parameter estimates are also obtained based on the missing-information identity [T. A. Louis, J. R. Stat. Soc., Ser. B 44, 226–233 (1982; Zbl 0488.62018)]. The performance of the algorithm is evaluated through simulation studies and an application to the analysis of the IPIP-NEO personality inventory. Extensions of the proposed algorithm to other latent variable models are discussed.
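
To make the structure of such a stochastic EM iteration concrete, the following is a minimal Python/NumPy sketch for a multidimensional two-parameter logistic item factor model. It is only an illustration under simplifying assumptions, not the authors' implementation: a random-walk Metropolis update stands in for the paper's adaptive-rejection-based Gibbs sampler, a single proximal gradient (soft-thresholding) step stands in for their M-step optimizer, and all function names, step sizes, penalty weights, and the fixed burn-in are hypothetical choices.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def soft_threshold(x, t):
        # Proximal operator of the L1 norm (lasso-type penalty on the loadings).
        return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

    def stochastic_em(Y, K, n_iter=500, burn_in=250, step=0.01, lam=0.0,
                      mh_scale=0.5, seed=0):
        # Toy stochastic EM for a multidimensional 2PL model:
        #   P(y_ij = 1 | theta_i) = sigmoid(a_j' theta_i + b_j),  theta_i ~ N(0, I_K).
        # Y is an (N, J) binary response matrix; K is the number of latent traits.
        rng = np.random.default_rng(seed)
        N, J = Y.shape
        A = rng.normal(scale=0.1, size=(J, K))   # item loadings
        b = np.zeros(J)                          # item intercepts
        theta = rng.normal(size=(N, K))          # current latent-trait draws
        A_sum, b_sum, n_avg = np.zeros_like(A), np.zeros_like(b), 0

        def log_post(th):
            # Unnormalised log posterior of each person's latent traits.
            lin = th @ A.T + b
            return (Y * lin - np.logaddexp(0.0, lin)).sum(axis=1) - 0.5 * (th ** 2).sum(axis=1)

        for it in range(n_iter):
            # Stochastic E step: one random-walk Metropolis update per person
            # (a stand-in for the adaptive-rejection-based Gibbs sampler of the paper).
            prop = theta + mh_scale * rng.normal(size=theta.shape)
            accept = np.log(rng.uniform(size=N)) < log_post(prop) - log_post(theta)
            theta[accept] = prop[accept]

            # M step: one proximal gradient ascent step on the complete-data
            # log-likelihood, with soft-thresholding for an optional L1 penalty.
            P = sigmoid(theta @ A.T + b)
            R = Y - P
            A = soft_threshold(A + step * (R.T @ theta) / N, step * lam)
            b = b + step * R.mean(axis=0)

            # Average the post-burn-in parameter iterates as the final estimate.
            if it >= burn_in:
                A_sum += A
                b_sum += b
                n_avg += 1

        return A_sum / n_avg, b_sum / n_avg

    # Example usage on simulated data (loadings are identified only up to
    # rotation unless constraints are imposed):
    rng = np.random.default_rng(1)
    N, J, K = 2000, 20, 3
    A_true = rng.normal(size=(J, K))
    b_true = rng.normal(size=J)
    theta_true = rng.normal(size=(N, K))
    Y = (rng.uniform(size=(N, J)) < sigmoid(theta_true @ A_true.T + b_true)).astype(float)
    A_hat, b_hat = stochastic_em(Y, K)

Averaging the post-burn-in parameter iterates in the sketch reflects the asymptotic results of Nielsen (2000) cited above; in the paper, the burn-in size and the stopping point are instead determined by the proposed diagnostic procedures rather than fixed in advance.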

MSC:
62F10 Point estimation
62F12 Asymptotic properties of parametric estimators
62P15 Applications of statistics to psychology
62J20 Diagnostics, and linear inference and regression
62H12 Estimation in multivariate analysis