Speeding Up MCMC by Efficient Data Subsampling
From MaRDI portal
Publication:5231510
DOI: 10.1080/01621459.2018.1448827 · zbMath: 1420.62121 · arXiv: 1404.4178 · OpenAlex: W2128709328 · MaRDI QID: Q5231510
Matias Quiroz, Robert Kohn, Minh-Ngoc Tran, Mattias Villani
Publication date: 27 August 2019
Published in: Journal of the American Statistical Association
Full work available at URL: https://arxiv.org/abs/1404.4178
Keywords: Bayesian inference; survey sampling; big data; estimated likelihood; block pseudo-marginal; correlated pseudo-marginal
Related Items
- Optimal Distributed Subsampling for Maximum Quasi-Likelihood Estimators With Massive Data
- Mini-Batch Metropolis–Hastings With Reversible SGLD Proposal
- Parallel Markov chain Monte Carlo for Bayesian hierarchical models with big data, in two stages
- Piecewise deterministic Markov processes for continuous-time Monte Carlo
- The Block-Poisson Estimator for Optimally Tuned Exact Subsampling MCMC
- An Approach to Incorporate Subsampling Into a Generic Bayesian Hierarchical Model
- Challenges in Markov chain Monte Carlo for Bayesian neural networks
- Most likely optimal subsampled Markov chain Monte Carlo
- New models for symbolic data analysis
- Reflections on Bayesian inference and Markov chain Monte Carlo
- A two-stage adaptive Metropolis algorithm
- Distributed penalized modal regression for massive data
- Finding our way in the dark: approximate MCMC for approximate Bayesian methods
- Distributed computation for marginal likelihood based model choice
- Optimal subsampling algorithms for composite quantile regression in massive data
- Divide-and-conquer Metropolis-Hastings samplers with matched samples
- An efficient adaptive MCMC algorithm for pseudo-Bayesian quantum tomography
- Randomized time Riemannian manifold Hamiltonian Monte Carlo
- Subsampling sequential Monte Carlo for static Bayesian models
- Parallel inference for big data with the group Bayesian method
- Two-stage Metropolis-Hastings for tall data
- Bayesian computation: a summary of the current state, and samples backwards and forwards
- Random projections for Bayesian regression
- Bayesian Conditional Density Filtering
- Speeding up MCMC by Delayed Acceptance and Data Subsampling
- Scalable Bayesian Nonparametric Clustering and Classification
- Emulation-accelerated Hamiltonian Monte Carlo algorithms for parameter estimation and uncertainty quantification in differential equation models
- Subsampling MCMC -- an introduction for the survey statistician
- Informed sub-sampling MCMC: approximate Bayesian inference for large datasets
- Stochastic Gradient Markov Chain Monte Carlo
- Optimal subsampling for least absolute relative error estimators with massive data
Uses Software
Cites Work
- The pseudo-marginal approach for efficient Monte Carlo computations
- On some properties of Markov chain Monte Carlo simulation methods based on the particle filter
- Weak convergence and optimal scaling of random walk Metropolis algorithms
- Model assisted survey sampling.
- Merging MCMC subposteriors through Gaussian-process approximations
- On Russian roulette estimates for Bayesian inference with doubly-intractable likelihoods
- Rates of convergence for the empirical quantization error
- On the efficiency of pseudo-marginal random walk Metropolis algorithms
- Speeding up MCMC by Delayed Acceptance and Data Subsampling
- Sampling-Based Approaches to Calculating Marginal Densities
- Monte Carlo evaluation of functionals of solutions of stochastic differential equations. Variance reduction and numerical examples
- The multivariate L1-median and associated data depth
- A Parallel Mixture of SVMs for Very Large Scale Problems
- Efficient implementation of Markov chain Monte Carlo when using an unbiased likelihood estimator
- A Generalization of Sampling Without Replacement From a Finite Universe