Pages that link to "Item:Q4600705"
From MaRDI portal
The following pages link to Posterior consistency for Gaussian process approximations of Bayesian posterior distributions (Q4600705):
Displaying 30 items.
- Adaptive sampling-based quadrature rules for efficient Bayesian prediction (Q782004) (← links)
- Convergence rates for a class of estimators based on Stein's method (Q1740521) (← links)
- Fast sampling of parameterised Gaussian random fields (Q1987947) (← links)
- Parallel Gaussian process surrogate Bayesian inference with noisy likelihood evaluations (Q2057377) (← links)
- Consistent online Gaussian process regression without the sample complexity bottleneck (Q2058904) (← links)
- The SPDE approach to Matérn fields: graph representations (Q2092895) (← links)
- Calibrate, emulate, sample (Q2123875) (← links)
- Randomized approaches to accelerate MCMC algorithms for Bayesian inverse problems (Q2129320) (← links)
- Bayesian inversion using adaptive polynomial chaos kriging within subset simulation (Q2133745) (← links)
- Rate-optimal refinement strategies for local approximation MCMC (Q2172107) (← links)
- Chebyshev-Legendre spectral method and inverse problem analysis for the space fractional Benjamin-Bona-Mahony equation (Q2181668) (← links)
- Adaptive multi-fidelity polynomial chaos approach to Bayesian inference in inverse problems (Q2214560) (← links)
- Stein variational gradient descent with local approximations (Q2246277) (← links)
- A modern retrospective on probabilistic numerics (Q2302460) (← links)
- GParareal: a time-parallel ODE solver using Gaussian process emulation (Q2680306) (← links)
- Surrogate modeling for Bayesian inverse problems based on physics-informed neural networks (Q2683056) (← links)
- Numerical Gaussian Processes for Time-Dependent and Nonlinear Partial Differential Equations (Q3130409) (← links)
- Sampling-free Bayesian inversion with adaptive hierarchical tensor representations (Q4634763) (← links)
- Consistency of empirical Bayes and kernel flow for hierarchical parameter estimation (Q4956916) (← links)
- Data-driven forward discretizations for Bayesian inversion (Q4970560) (← links)
- On the local Lipschitz stability of Bayesian inverse problems (Q5000612) (← links)
- An Acceleration Strategy for Randomize-Then-Optimize Sampling Via Deep Neural Networks (Q5079536) (← links)
- Bayesian Model Calibration with Interpolating Polynomials based on Adaptively Weighted Leja Nodes (Q5161991) (← links)
- An Adaptive Surrogate Modeling Based on Deep Neural Networks for Large-Scale Bayesian Inverse Problems (Q5162376) (← links)
- Convergence of spectral likelihood approximation based on q-Hermite polynomials for Bayesian inverse problems (Q5869782) (← links)
- Markov chain generative adversarial neural networks for solving Bayesian inverse problems in physics applications (Q6052371) (← links)
- Asymptotic Bounds for Smoothness Parameter Estimates in Gaussian Process Interpolation (Q6062242) (← links)
- Deep Learning in High Dimension: Neural Network Expression Rates for Analytic Functions in \(\pmb{L^2(\mathbb{R}^d,\gamma_d)}\) (Q6109160) (← links)
- Bayesian Inverse Problems Are Usually Well-Posed (Q6115454) (← links)
- Residual-based error correction for neural operator accelerated infinite-dimensional Bayesian inverse problems (Q6147083) (← links)