Gaussian approximations for probability measures on \(\mathbb{R}^d\)
Publication: Q4636355
DOI: 10.1137/16M1105384 · zbMATH Open: 1390.60022 · arXiv: 1611.08642 · MaRDI QID: Q4636355
Authors: Yulong Lu, Hendrik Weber, A. M. Stuart
Publication date: 19 April 2018
Published in: SIAM/ASA Journal on Uncertainty Quantification
Abstract: This paper concerns the approximation of probability measures on \(\mathbb{R}^d\) with respect to the Kullback-Leibler divergence. Given an admissible target measure, we show the existence of the best approximation, with respect to this divergence, from certain sets of Gaussian measures and Gaussian mixtures. The asymptotic behavior of such best approximations is then studied in the small parameter limit where the measure concentrates; this asymptotic behavior is characterized using \(\Gamma\)-convergence. The theory developed is then applied to understanding the frequentist consistency of Bayesian inverse problems. For a fixed realization of noise, we show the asymptotic normality of the posterior measure in the small noise limit. Taking into account the randomness of the noise, we prove a Bernstein-von Mises type result for the posterior measure.
Full work available at URL: https://arxiv.org/abs/1611.08642
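To make the abstract's central object concrete, the following sketch finds the best Gaussian approximation, in Kullback-Leibler divergence, to a one-dimensional measure \(p_\varepsilon(x) \propto \exp(-V(x)/\varepsilon)\) that concentrates as \(\varepsilon \to 0\). The potential \(V\), the quadrature rule, and the brute-force grid search are illustrative choices for this toy example, not the paper's construction.

```python
import numpy as np

def kl_up_to_const(m, s, log_p_tilde, n=41):
    """KL(N(m, s^2) || p), up to the parameter-free constant log Z.

    KL(q || p) = -H(q) - E_q[log p_tilde] + log Z, so minimizing the
    first two terms over (m, s) finds the best Gaussian approximation
    even when p is known only up to normalization.
    """
    # Probabilists' Gauss-Hermite rule: weight exp(-x^2/2), weights sum to sqrt(2*pi)
    x, w = np.polynomial.hermite_e.hermegauss(n)
    w = w / np.sqrt(2.0 * np.pi)  # normalize so the rule integrates against N(0, 1)
    entropy = 0.5 * np.log(2.0 * np.pi * np.e * s**2)
    return -entropy - np.sum(w * log_p_tilde(m + s * x))

# Concentrating target: p_eps ∝ exp(-V/eps), single well with minimum at x = 1
eps = 0.05
V = lambda x: 0.5 * (x - 1.0)**2 + (x - 1.0)**4
log_p_tilde = lambda x: -V(x) / eps

# Brute-force search over the Gaussian family (mean, standard deviation)
ms = np.linspace(0.5, 1.5, 101)
ss = np.linspace(0.05, 0.6, 112)
best_kl, m_best, s_best = min(
    (kl_up_to_const(m, s, log_p_tilde), m, s) for m in ms for s in ss
)
print(m_best, s_best)  # mean near the minimizer x = 1; variance shrinks with eps
```

The best approximation centers on the minimizer of \(V\) with variance of order \(\varepsilon / V''(x^*)\), echoing the small-parameter asymptotics that the paper characterizes via \(\Gamma\)-convergence.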
Recommendations
- Kullback-Leibler approximation for probability measures on infinite dimensional spaces
- Algorithms for Kullback-Leibler approximation of probability measures in infinite dimensions
- Non-Gaussian statistical inverse problems. II: Posterior convergence for approximated unknowns
- Gaussian approximations of small noise diffusions in Kullback-Leibler divergence
- Gaussian approximation of general non-parametric posterior distributions
Mathematics Subject Classification:
- Convergence of probability measures (60B10)
- Bayesian inference (62F15)
- Stochastic calculus of variations and the Malliavin calculus (60H07)
Cites Work
- Graphical models, exponential families, and variational inference
- Asymptotic Statistics
- Pattern recognition and machine learning.
- Mathematical foundations of infinite-dimensional statistical models
- Convergence rates of posterior distributions.
- Convergence of estimates under dimensionality restrictions
- Nonparametric Bernstein-von Mises theorems in Gaussian white noise
- On the Bernstein-von Mises phenomenon for nonparametric Bayes procedures
- Asymptotic equivalence of nonparametric regression and white noise
- On the Bernstein-von Mises theorem with infinite-dimensional parameters
- Posterior contraction rates for the Bayesian approach to linear ill-posed inverse problems
- Inverse problems: a Bayesian perspective
- Title not available
- Bayesian inverse problems with Gaussian priors
- Rates of convergence of posterior distributions.
- Bayesian nonparametrics
- Saddlepoint approximations
- On the Bernstein-von Mises phenomenon in the Gaussian white noise model
- The consistency of posterior distributions in nonparametric problems
- Poincaré and logarithmic Sobolev inequalities by decomposition of the energy landscape
- Algorithms for Kullback-Leibler approximation of probability measures in infinite dimensions
- Posterior consistency for Bayesian inverse problems through stability and regression results
- Improving model fidelity and sensitivity for complex systems through empirical information theory
- On the Bernstein-von Mises approximation of posterior distributions
- Kullback-Leibler approximation for probability measures on infinite dimensional spaces
- Gaussian approximations of small noise diffusions in Kullback-Leibler divergence
- Gaussian approximations for transition paths in Brownian dynamics
Cited In (22)
- Frequentist consistency of variational Bayes
- Variational Bayes for High-Dimensional Linear Regression With Sparse Priors
- Γ-convergence of Onsager–Machlup functionals: II. Infinite product measures on Banach spaces
- Title not available
- Algorithms for Kullback-Leibler approximation of probability measures in infinite dimensions
- Convergence of spectral likelihood approximation based on q-Hermite polynomials for Bayesian inverse problems
- Principal feature detection via \(\phi \)-Sobolev inequalities
- Relative entropy minimization over Hilbert spaces via Robbins-Monro
- Universal Gaussian approximations under random censorship
- Consensus‐based sampling
- Generalized Modes in Bayesian Inverse Problems
- Bayesian calibration for large‐scale fluid structure interaction problems under embedded/immersed boundary framework
- Exact Lower and Upper Bounds for Gaussian Measures
- Title not available
- Variational Gaussian approximation for Poisson data
- Title not available
- On the convergence of the Laplace approximation and noise-level-robustness of Laplace-based Monte Carlo methods for Bayesian inverse problems
- Kullback-Leibler approximation for probability measures on infinite dimensional spaces
- On log-concave approximations of high-dimensional posterior measures and stability properties in non-linear inverse problems
- Title not available
- Gaussian approximations of small noise diffusions in Kullback-Leibler divergence
- Variational Bayesian approximation of inverse problems using sparse precision matrices