Improving bridge estimators via \(f\)-GAN
From MaRDI portal
Publication: 2080347
DOI: 10.1007/s11222-022-10133-y
zbMath: 1496.62024
arXiv: 2106.07462
OpenAlex: W3171868693
MaRDI QID: Q2080347
Publication date: 7 October 2022
Published in: Statistics and Computing
Full work available at URL: https://arxiv.org/abs/2106.07462
Keywords: Bayes factor; \(f\)-divergence; normalizing constants; Monte Carlo estimation; generative adversarial network; normalizing flow
MSC classification: Computational methods for problems pertaining to statistics (62-08); Bayesian inference (62F15); Monte Carlo methods (65C05)
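The keywords above concern Monte Carlo estimation of normalizing constants via bridge sampling, the estimator this publication improves. As a minimal illustration of the underlying technique (not the paper's \(f\)-GAN-based method), the sketch below estimates a known normalizing constant with a plain bridge estimator using the geometric bridge function \(\alpha \propto 1/\sqrt{q_1 q_2}\); the target and proposal densities are hypothetical choices for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def q1(x):
    # Unnormalized target density exp(-x^2/2); its true constant is sqrt(2*pi)
    return np.exp(-x**2 / 2)

def q2(x):
    # Normalized proposal: N(0, 1.5^2) density, so its normalizing constant is 1
    return np.exp(-x**2 / (2 * 1.5**2)) / (1.5 * np.sqrt(2 * np.pi))

n = 200_000
x1 = rng.normal(0.0, 1.0, size=n)   # draws from the target
x2 = rng.normal(0.0, 1.5, size=n)   # draws from the proposal

# Geometric bridge alpha = 1/sqrt(q1*q2) yields the ratio estimator
#   Z1/Z2 ≈ mean_{x~q2} sqrt(q1(x)/q2(x)) / mean_{x~q1} sqrt(q2(x)/q1(x))
z_hat = np.mean(np.sqrt(q1(x2) / q2(x2))) / np.mean(np.sqrt(q2(x1) / q1(x1)))
print(z_hat)  # should approach sqrt(2*pi) ≈ 2.5066
```

With samples available from both densities, the bridge estimator's accuracy hinges on the overlap between \(q_1\) and \(q_2\) and on the choice of bridge function; the paper's contribution is to learn a better proposal/bridge via \(f\)-GAN training rather than fixing it a priori.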
Uses Software
Cites Work
- Properties of the bridge sampler with a focus on splitting the MCMC sample
- Default Bayesian model determination methods for generalised linear mixed models
- Estimating marginal likelihoods for mixture and Markov switching models using bridge sampling techniques
- Divergence measures based on the Shannon entropy
- Using simulation methods for Bayesian econometric models: inference, development, and communication
- Fitting Full-Information Item Factor Models and an Empirical Investigation of Bridge Sampling
- A Theory of Statistical Models for Monte Carlo Integration
- A likelihood-based method for analysing longitudinal binary responses
- Estimating Divergence Functionals and the Likelihood Ratio by Convex Risk Minimization
- Asymptotic Properties of Non-Linear Least Squares Estimators
- Simulating normalizing constants: From importance sampling to bridge sampling to path sampling
- Nested sampling for general Bayesian computation