Iterative construction of Gaussian process surrogate models for Bayesian inference
From MaRDI portal
Publication: 2301102
DOI: 10.1016/j.jspi.2019.11.002
zbMath: 1437.62105
arXiv: 1911.07227
OpenAlex: W2988784546
Wikidata: Q126776645 (Scholia: Q126776645)
MaRDI QID: Q2301102
Leen Alawieh, John B. Bell, Jonathan B. Goodman
Publication date: 28 February 2020
Published in: Journal of Statistical Planning and Inference
Full work available at URL: https://arxiv.org/abs/1911.07227
Mathematics Subject Classification:
- Gaussian processes (60G15)
- Bayesian inference (62F15)
- Applications of statistics in engineering and industry; control charts (62P30)
- Learning and adaptive systems in artificial intelligence (68T05)
Cites Work
- Ensemble samplers with affine invariance
- Multi-output separable Gaussian process: towards an efficient, fully Bayesian paradigm for uncertainty quantification
- Query efficient posterior estimation in scientific experiments via Bayesian active learning
- Bayesian inference with optimal maps
- Stochastic spectral methods for efficient Bayesian solution of inverse problems
- Adaptive rejection Metropolis sampling using Lagrange interpolation polynomials of degree 2
- Efficient global optimization of expensive black-box functions
- Exponential convergence of Langevin distributions and their discrete approximations
- Design and analysis of computer experiments. With comments and a rejoinder by the authors
- Active learning with adaptive regularization
- An efficient proposal distribution for Metropolis-Hastings using a B-splines technique
- Sampling the posterior: an approach to non-Gaussian data assimilation
- Dimension-independent likelihood-informed MCMC
- Using Bayesian statistics in the estimation of heat source in radiation
- Coarse-gradient Langevin algorithms for dynamic data integration and uncertainty quantification
- A general purpose sampling algorithm for continuous distributions (the t-walk)
- Bayesian Calibration of Computer Models
- Advances in the Sequential Design of Computer Experiments Based on Active Learning
- A Stochastic Newton MCMC Method for Large-Scale Statistical Inverse Problems with Application to Seismic Inversion
- Bayesian Treed Gaussian Process Models With an Application to Computer Modeling
- Bayesian Design and Analysis of Computer Experiments: Use of Derivatives in Surface Prediction
- Stochastic inverse heat conduction using a spectral approach
- Transport Map Accelerated Markov Chain Monte Carlo
- Approximation errors and model reduction with an application in optical diffusion tomography
- Computer Model Calibration Using High-Dimensional Output
- Spectral Methods for Uncertainty Quantification
- Model Reduction for Large-Scale Systems with High-Dimensional Parametric Input Space
- A Bayesian approach to model inadequacy for polynomial regression
- Adaptive Rejection Metropolis Sampling within Gibbs Sampling
- On Markov chain Monte Carlo methods for nonlinear and non-Gaussian state-space models
- Independent Doubly Adaptive Rejection Metropolis Sampling Within Gibbs Sampling
- Iterative Importance Sampling Algorithms for Parameter Estimation
- Riemann Manifold Langevin and Hamiltonian Monte Carlo Methods
- Inverse Problem Theory and Methods for Model Parameter Estimation
- Preconditioning Markov Chain Monte Carlo Simulations Using Coarse-Scale Models