Properties of the stochastic approximation EM algorithm with mini-batch sampling
From MaRDI portal
Publication:2209731
DOI: 10.1007/s11222-020-09968-0 · zbMath: 1458.62184 · arXiv: 1907.09164 · OpenAlex: W3083259475 · MaRDI QID: Q2209731
Catherine Matias, Estelle Kuhn
Publication date: 4 November 2020
Published in: Statistics and Computing
Full work available at URL: https://arxiv.org/abs/1907.09164
Point estimation (62F10) · Monte Carlo methods (65C05) · Stochastic approximation (62L20) · Statistical aspects of big data and data science (62R07)
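The algorithm studied in this publication, the stochastic approximation EM (SAEM) with mini-batch sampling, alternates a simulation step for the latent variables on a random mini-batch, a stochastic approximation update of the sufficient statistics, and a maximization step. The sketch below illustrates this scheme on a toy two-component Gaussian mixture with known weights and variances; the model, step-size schedule, batch size, and all variable names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: balanced two-component Gaussian mixture, unit variances (assumed known).
n = 2000
true_mu = np.array([-2.0, 2.0])
z_true = rng.integers(0, 2, size=n)
y = rng.normal(true_mu[z_true], 1.0)

# Per-observation sufficient statistics:
#   s0[i, k] ~ E[1{z_i = k}],  s1[i, k] ~ E[y_i * 1{z_i = k}].
s0 = np.full((n, 2), 0.5)
s1 = s0 * y[:, None]

mu = np.array([-1.0, 1.0])   # initial means; only the means are estimated here
batch = 200                  # mini-batch size (illustrative choice)

for k in range(1, 301):
    # Common SAEM schedule: constant step during a warm-up, then decreasing.
    gamma = 1.0 if k <= 100 else 1.0 / (k - 100)
    idx = rng.choice(n, size=batch, replace=False)   # sample a mini-batch

    # Simulation step on the mini-batch: draw z_i from its conditional given y_i, mu.
    logits = -0.5 * (y[idx, None] - mu[None, :]) ** 2
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    z = (rng.random(batch) < p[:, 1]).astype(int)
    onehot = np.eye(2)[z]

    # Stochastic approximation update of the statistics, on sampled indices only.
    s0[idx] += gamma * (onehot - s0[idx])
    s1[idx] += gamma * (onehot * y[idx, None] - s1[idx])

    # Maximization step: closed-form update of the means from aggregated statistics.
    mu = s1.sum(axis=0) / np.maximum(s0.sum(axis=0), 1e-12)

print(mu)   # should approach the true means, roughly [-2, 2]
```

Only the statistics of the sampled indices are refreshed at each iteration, which is what makes the per-iteration cost proportional to the mini-batch size rather than to n.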
Related Items (3)
- Global implicit function theorems and the online expectation–maximisation algorithm
- On the convergence of stochastic approximations under a subgeometric ergodic Markov dynamic
- On the curved exponential family in the Stochastic Approximation Expectation Maximization Algorithm
Cites Work
- Modeling heterogeneity in random graphs through latent space models: a selective review
- Construction of Bayesian deformable models via a stochastic approximation algorithm: a convergence study
- Maximum likelihood estimation in nonlinear mixed effects models
- Convergence of a stochastic approximation version of the EM algorithm
- Mini-batch learning of exponential family finite mixture models
- The frailty model.
- On-Line Expectation–Maximization Algorithm for latent Data Models
- On the geometric ergodicity of hybrid samplers
- Coupling a stochastic approximation version of EM with an MCMC procedure
- Towards a Coherent Statistical Framework for Dense Deformable Template Estimation
- Convergence of the Wang-Landau algorithm
- Stability of Stochastic Approximation under Verifiable Conditions