Target-aware Bayesian inference: how to beat optimal conventional estimators
Publication: Q4969150
zbMATH Open: 1504.62036 · MaRDI QID: Q4969150 · FDO: Q4969150
Authors: Tom Rainforth, Adam Goliński, Frank Wood, Sheheryar Zaidi
Publication date: 5 October 2020
Full work available at URL: https://jmlr.csail.mit.edu/papers/v21/19-102.html
Cites Work
- The sample size required in importance sampling
- Sequential Monte Carlo Methods in Practice
- The Bayesian Choice
- Estimating marginal likelihoods for mixture and Markov switching models using bridge sampling techniques
- Methods for approximating integrals in statistics with special emphasis on Bayesian integration problems
- Simulating normalizing constants: From importance sampling to bridge sampling to path sampling
- On Monte Carlo methods for estimating ratios of normalizing constants
- Properties of nested sampling
- Adaptive multiple importance sampling
- Safe and Effective Importance Sampling
- Some summation formulas involving harmonic numbers and generalized harmonic numbers
- Methods of reducing sample size in Monte Carlo computations
- Convergence of adaptive mixtures of importance sampling schemes
- Adaptive umbrella sampling: Self-consistent determination of the non-Boltzmann bias
- Adaptive importance sampling in Monte Carlo integration
- A tutorial on bridge sampling
- Layered adaptive importance sampling
- Generalized multiple importance sampling
Cited In (2)
This page was built for publication: Target-aware Bayesian inference: how to beat optimal conventional estimators