Performance study of marginal posterior density estimation via Kullback-Leibler divergence
From MaRDI portal
Publication:1382945
DOI: 10.1007/BF02564702
zbMath: 0905.62021
OpenAlex: W1993778231
MaRDI QID: Q1382945
Publication date: 7 February 1999
Published in: Test
Full work available at URL: https://doi.org/10.1007/bf02564702
Keywords: Markov chain Monte Carlo; kernel density estimation; Bayesian computation; conditional marginal density estimation; importance-weighted marginal density estimation
MSC classification: Density estimation (62G07); Bayesian inference (62F15); Monte Carlo methods (65C05); Statistical aspects of information-theoretic topics (62B10)
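The publication evaluates marginal posterior density estimators (kernel-based and importance-weighted) by their Kullback-Leibler divergence from the target density. A minimal illustrative sketch of that performance measure, not the paper's own code: here the "true" marginal is taken to be a standard normal, the estimator is a Gaussian kernel density estimate built from simulated posterior draws, and KL(p || p̂) is approximated by Monte Carlo.

```python
# Illustrative sketch (assumed setup, not the paper's code): score a
# marginal posterior density estimate by its Kullback-Leibler divergence
# from the true density, KL(p || p_hat) = E_p[log p(x) - log p_hat(x)].
import numpy as np
from scipy.stats import gaussian_kde, norm

rng = np.random.default_rng(0)
draws = rng.standard_normal(2000)   # stand-in for MCMC posterior draws
kde = gaussian_kde(draws)           # kernel density estimator p_hat

# Monte Carlo estimate of KL(p || p_hat) using fresh draws from p.
x = rng.standard_normal(5000)
kl = np.mean(norm.logpdf(x) - kde.logpdf(x))
print(f"estimated KL divergence: {kl:.4f}")
```

A smaller value indicates a better density estimate; with a well-calibrated bandwidth and this many draws the estimated divergence is close to zero.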
Related Items (3)
- Use in practice of importance sampling for repeated MCMC for Poisson models
- Using MCMC chain outputs to efficiently estimate Bayes factors
- Variable selection for multivariate logistic regression models
Cites Work
- On Kullback-Leibler loss and density estimation
- On global properties of variable bandwidth density estimators
- Markov chains for exploring posterior distributions. (With discussion)
- Weighted distributions viewed in the context of model selection: A Bayesian perspective
- The Berry-Esseen bound for Student's statistic
- Marginal Likelihood from the Gibbs Output
- Approximation Theorems of Mathematical Statistics
- Sampling-Based Approaches to Calculating Marginal Densities
- Stochastic Relaxation, Gibbs Distributions, and the Bayesian Restoration of Images
- Importance-Weighted Marginal Bayesian Posterior Density Estimation
- Computing Bayes Factors Using a Generalization of the Savage-Dickey Density Ratio
- Monte Carlo sampling methods using Markov chains and their applications