Importance sampling: intrinsic dimension and computational cost
Publication: 1750255
DOI: 10.1214/17-STS611
zbMath: 1442.62026
arXiv: 1511.06196
MaRDI QID: Q1750255
Andrew M. Stuart, Daniel Sanz-Alonso, Sergios Agapiou, Omiros Papaspiliopoulos
Publication date: 18 May 2018
Published in: Statistical Science
Full work available at URL: https://arxiv.org/abs/1511.06196
Keywords: importance sampling; inverse problems; filtering; absolute continuity; computational cost; small noise; notions of dimension
MSC classification: Computational methods for problems pertaining to statistics (62-08); Sampling theory, sample surveys (62D05)
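The record carries no abstract, so the following is a minimal, illustrative Python sketch of the paper's subject: a self-normalized importance sampling estimator together with the effective sample size, whose collapse as the dimension grows is one way the cost-versus-dimension theme of the paper shows up in practice. The function name snis_mean_and_ess, the Gaussian target/proposal pair, and all parameter values are assumptions made for illustration only; this is not code from the paper.

    import numpy as np

    def snis_mean_and_ess(d, n_samples=10_000, shift=0.5, seed=0):
        # Proposal: N(0, I_d); target: N(shift * 1, I_d). Estimate E[x_1] under the target.
        rng = np.random.default_rng(seed)
        x = rng.standard_normal((n_samples, d))       # draws from the proposal
        mu = np.full(d, shift)
        # Log importance weights: log N(x; mu, I) - log N(x; 0, I) = x . mu - |mu|^2 / 2
        log_w = x @ mu - 0.5 * mu @ mu
        log_w -= log_w.max()                          # stabilise before exponentiating
        w = np.exp(log_w)
        w /= w.sum()                                  # self-normalised weights
        estimate = np.sum(w * x[:, 0])                # weighted estimate of E[x_1]; true value is shift
        ess = 1.0 / np.sum(w ** 2)                    # effective sample size, at most n_samples
        return estimate, ess

    if __name__ == "__main__":
        for d in (1, 10, 100):
            est, ess = snis_mean_and_ess(d)
            print(f"d={d:4d}  estimate={est:+.3f}  ESS={ess:8.1f}")

Running the sketch shows the effective sample size dropping sharply as d increases, even though the per-coordinate discrepancy between proposal and target stays fixed.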
Related Items (53)
- A Variational Inference Approach to Inverse Problems with Gamma Hyperpriors
- An Invitation to Sequential Monte Carlo Samplers
- Importance Sampling and Necessary Sample Size: An Information Theory Approach
- Continuum Limits of Posteriors in Graph Bayesian Inverse Problems
- Reduced modeling of unknown trajectories
- Unnamed Item
- The sample size required in importance sampling
- Unnamed Item
- Certified dimension reduction in nonlinear Bayesian inverse problems
- Autodifferentiable Ensemble Kalman Filters
- On a Metropolis-Hastings importance sampling estimator
- Moment matching adaptive importance sampling with skew-Student proposals
- Iterative importance sampling with Markov chain Monte Carlo sampling in robust Bayesian analysis
- Overcoming the timescale barrier in molecular dynamics: Transfer operators, variational principles and machine learning
- A unified performance analysis of likelihood-informed subspace methods
- Reduced-order autodifferentiable ensemble Kalman filters
- Scaling limits in computational Bayesian inversion
- Rethinking the Effective Sample Size
- Gradient-based adaptive importance samplers
- Properties of marginal sequential Monte Carlo methods
- Context-Aware Surrogate Modeling for Balancing Approximation and Sampling Costs in Multifidelity Importance Sampling and Bayesian Inverse Problems
- Ensemble MCMC: accelerating pseudo-marginal MCMC for state space models using the ensemble Kalman filter
- A fast particle-based approach for calibrating a 3-D model of the Antarctic ice sheet
- Analysis of a Computational Framework for Bayesian Inverse Problems: Ensemble Kalman Updates and MAP Estimators under Mesh Refinement
- Distilling Importance Sampling for Likelihood Free Inference
- Efficient importance sampling in low dimensions using affine arithmetic
- A Continuation Method in Bayesian Inference
- Scalable Optimization-Based Sampling on Function Space
- A PRticle filter algorithm for nonparametric estimation of multivariate mixing distributions
- Variance analysis of multiple importance sampling schemes
- Adaptive tuning of Hamiltonian Monte Carlo within sequential Monte Carlo
- Iterative updating of model error for Bayesian inversion
- Localization for MCMC: sampling high-dimensional posterior distributions with local structure
- Kernel Methods for Bayesian Elliptic Inverse Problems on Manifolds
- A Practical Example for the Non-linear Bayesian Filtering of Model Parameters
- MALA-within-Gibbs Samplers for High-Dimensional Distributions with Sparse Conditional Structure
- Multilevel Sequential Importance Sampling for Rare Event Estimation
- Importance sampling: intrinsic dimension and computational cost
- Multilevel ensemble Kalman filtering for spatio-temporal processes
- Convergence rates for optimised adaptive importance samplers
- Pricing discretely-monitored double barrier options with small probabilities of execution
- Multilevel sequential Monte Carlo for Bayesian inverse problems
- Symmetrized importance samplers for stochastic differential equations
- Rates of contraction of posterior distributions based on p-exponential priors
- Efficient probabilistic reconciliation of forecasts for real-valued and count time series
- Data-driven forward discretizations for Bayesian inversion
- Improving Approximate Bayesian Computation via Quasi-Monte Carlo
- Bayesian Parameter Identification in Cahn-Hilliard Models for Biological Growth
- Data assimilation: The Schrödinger perspective
- Product-form estimators: exploiting independence to scale up Monte Carlo
- A weighted discrepancy bound of quasi-Monte Carlo importance sampling
- The Ensemble Kalman Filter for Rare Event Estimation
- Quasi-Monte Carlo and Multilevel Monte Carlo Methods for Computing Posterior Expectations in Elliptic Inverse Problems
Uses Software
Cites Work
- Can local particle filters beat the curse of dimensionality?
- Oracle-type posterior contraction rates in Bayesian inverse problems
- Bayesian inverse problems with non-conjugate priors
- Non-Gaussian statistical inverse problems. Part I: Posterior distributions
- Non-Gaussian statistical inverse problems. II: Posterior convergence for approximated unknowns
- Evaluation for moments of a ratio with application to regression estimation
- Bayesian inverse problems with Gaussian priors
- On the convergence of two sequential Monte Carlo methods for maximum a posteriori sequence estimation and stochastic global optimization
- The expected ratio of the sum of squares to the square of the sum
- Estimation of high-dimensional prior and posterior covariance matrices in Kalman filter variants
- Parameter estimation by implicit sampling
- Fundamentals of stochastic filtering
- A note on auxiliary particle filters
- The ratio of the extreme to the sum in a random sequence
- A note on Metropolis-Hastings kernels for general state spaces
- The sample size required in importance sampling
- Importance sampling: intrinsic dimension and computational cost
- Statistical and computational inverse problems.
- Sharp adaptation for inverse problems with random noise
- Smoothness and dimension reduction in quasi-Monte Carlo methods
- Large deviations and importance sampling for systems of slow-fast motion
- Bayesian posterior contraction rates for linear severely ill-posed inverse problems
- Optimal rates for the regularized least-squares algorithm
- Central limit theorem for sequential Monte Carlo methods and its application to Bayesian inference
- Discrepancy based model selection in statistical inverse problems
- Posterior contraction rates for the Bayesian approach to linear ill-posed inverse problems
- Monte Carlo strategies in scientific computing.
- Another look at rejection sampling through importance sampling
- On the stability of sequential Monte Carlo methods in high dimensions
- Well-posed stochastic extensions of ill-posed linear problems
- Posterior consistency and convergence rates for Bayesian inversion with hypoelliptic operators
- Small-Noise Analysis and Symmetrization of Implicit Monte Carlo Samplers
- Bridging the ensemble Kalman and particle filters
- Analysis of the Hessian for inverse scattering problems: I. Inverse shape scattering of acoustic waves
- Inverse problems: A Bayesian perspective
- A Stochastic Newton MCMC Method for Large-Scale Statistical Inverse Problems with Application to Seismic Inversion
- Importance Sampling for Multiscale Diffusions
- Likelihood-informed dimension reduction for nonlinear inverse problems
- Active Subspaces
- Importance Sampling and Necessary Sample Size: An Information Theory Approach
- Optimal Low-rank Approximations of Bayesian Linear Inverse Problems
- Pushing the Limits of Contemporary Statistics: Contributions in Honor of Jayanta K. Ghosh
- Curse-of-dimensionality revisited: Collapse of the particle filter in very large scale systems
- Sequential Imputations and Bayesian Missing Data Problems
- Sequential Monte Carlo Methods for Dynamic Systems
- Filtering via Simulation: Auxiliary Particle Filters
- Safe and Effective Importance Sampling
- Linear and Nonlinear Inverse Problems with Practical Applications
- Rare Event Simulation of Small Noise Diffusions
- On a Likelihood Approach for Monte Carlo Integration
- Bayesian Measures of Model Complexity and Fit
- Linear inverse problems for generalised random variables
- Linear estimators and measurable linear transformations on a Hilbert space
- On Choosing and Bounding Probability Metrics
- Bayesian Recovery of the Initial Condition for the Heat Equation
- An Introduction to Sequential Monte Carlo
- A stable particle filter for a class of high-dimensional state-space models
- Stochastic Approximation in Monte Carlo Computation
- A survey of convergence results on particle filtering methods for practitioners
- Methods of Reducing Sample Size in Monte Carlo Computations
- A Computational Framework for Infinite-Dimensional Bayesian Inverse Problems Part I: The Linearized Case, with Application to Global Seismic Inversion
- Two-Stage Importance Sampling With Mixture Proposals
- Posterior consistency for Bayesian inverse problems through stability and regression results
- Learning Bounds for Kernel Regression Using Effective Data Dimensionality
- On the best constant in Marcinkiewicz-Zygmund inequality
- On the role of interaction in sequential Monte Carlo algorithms
- MCMC methods for functions: modifying old algorithms to make them faster