Scaling up Bayesian variational inference using distributed computing clusters
From MaRDI portal
Cites work
- scientific article; zbMATH DE number 1666084 (no title available)
- scientific article; zbMATH DE number 6377992 (no title available)
- scientific article; zbMATH DE number 1043533 (no title available)
- scientific article; zbMATH DE number 2107836 (no title available)
- Latent Dirichlet allocation (DOI: 10.1162/jmlr.2003.3.4-5.993)
- A Stochastic Approximation Method
- Adaptive subgradient methods for online learning and stochastic optimization
- Error bounds and convergence analysis of feasible descent methods: A general approach
- MLlib: machine learning in Apache Spark
- Online model selection based on the variational Bayes
- Variational message passing
Cited in (11)
- Distributed Bayesian machine learning procedures
- Distributed Bayesian matrix factorization with limited communication
- Fast approximation of variational Bayes Dirichlet process mixture using the maximization-maximization algorithm
- Distributed Bayesian learning with stochastic natural gradient expectation propagation and the posterior server
- BayesPy: variational Bayesian inference in Python
- scientific article; zbMATH DE number 7365721 (no title available)
- Trust-region based stochastic variational inference for distributed and asynchronous networks
- Scalable importance sampling estimation of Gaussian mixture posteriors in Bayesian networks
- Patterns of scalable Bayesian inference
- Exploiting multi-core architectures for reduced-variance estimation with intractable likelihoods
- Using Storm for scaleable sequential statistical inference
This page was built for publication: Scaling up Bayesian variational inference using distributed computing clusters (MaRDI item Q2411280)