Efficient preconditioned stochastic gradient descent for estimation in latent variable models

From MaRDI portal
Publication: Q6441111

arXiv: 2306.12841 · MaRDI QID: Q6441111 · FDO: Q6441111

Charlotte Baey, Jean-Benoist Leger, M. Delattre, Sarah Lemler, Estelle Kuhn

Publication date: 22 June 2023

Abstract: Latent variable models are powerful tools for modeling complex phenomena involving partially observed data, unobserved variables, or underlying complex unknown structures. Inference is often difficult due to the latent structure of the model. To deal with parameter estimation in the presence of latent variables, well-known efficient methods exist, such as gradient-based and EM-type algorithms, but they suffer from practical and theoretical limitations. In this paper, we propose an efficient preconditioned stochastic gradient algorithm as an alternative for parameter estimation. Our method includes a preconditioning step based on a positive definite estimate of the Fisher information matrix. We prove convergence results for the proposed algorithm under mild assumptions for very general latent variable models. We illustrate the performance of the proposed methodology through relevant simulations in a nonlinear mixed effects model and in a stochastic block model.
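The preconditioned update described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's algorithm: it uses a fully observed one-dimensional Gaussian instead of a latent variable model, a running average of score outer products as the Fisher information estimate, and an illustrative ridge term to keep that estimate positive definite and invertible.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: i.i.d. draws from N(mu_true, 1); we estimate mu by
# stochastic gradient ascent on the log-likelihood.
mu_true = 2.0
data = rng.normal(mu_true, 1.0, size=5000)

def score(mu, x):
    # Gradient of log N(x; mu, 1) with respect to mu.
    return np.array([x - mu[0]])

mu = np.array([0.0])   # initial parameter value
fisher = np.eye(1)     # running Fisher information estimate
ridge = 1e-3           # illustrative regularization, keeps the estimate invertible

for t, x in enumerate(data, start=1):
    g = score(mu, x)
    # Running average of score outer products approximates the Fisher matrix.
    fisher = (1 - 1 / t) * fisher + (1 / t) * np.outer(g, g)
    # Preconditioned stochastic gradient step with decreasing step size.
    precond = np.linalg.inv(fisher + ridge * np.eye(1))
    mu = mu + (1.0 / t) * precond @ g

print(mu[0])  # close to mu_true
```

In this one-dimensional toy case the Fisher information is 1, so the preconditioned update asymptotically tracks the running sample mean; the point of the preconditioner is the multivariate setting, where it rescales and rotates the stochastic gradient.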

Has companion code repository: https://gitlab.com/jbleger/sbm_with_fisher-sgd

This page was built for publication: Efficient preconditioned stochastic gradient descent for estimation in latent variable models
