An Adaptive Surrogate Modeling Based on Deep Neural Networks for Large-Scale Bayesian Inverse Problems

From MaRDI portal
Publication: 5162376

DOI: 10.4208/cicp.OA-2020-0186
zbMath: 1482.65206
arXiv: 1911.08926
OpenAlex: W3103198077
MaRDI QID: Q5162376

Liang Yan, Tao Zhou

Publication date: 2 November 2021

Published in: Communications in Computational Physics

Full work available at URL: https://arxiv.org/abs/1911.08926



Related Items

Calibrate, emulate, sample
Multi-fidelity Bayesian neural networks: algorithms and applications
Randomized approaches to accelerate MCMC algorithms for Bayesian inverse problems
Image inversion and uncertainty quantification for constitutive laws of pattern formation
Bayesian inversion using adaptive polynomial chaos kriging within subset simulation
Surrogate and inverse modeling for two-phase flow in porous media via theory-guided convolutional neural network
An Acceleration Strategy for Randomize-Then-Optimize Sampling Via Deep Neural Networks
Ensemble Inference Methods for Models With Noisy and Expensive Likelihoods
MFNets: data efficient all-at-once learning of multifidelity surrogates as directed networks of information sources
A Bayesian scheme for reconstructing obstacles in acoustic waveguides
Adaptive weighting of Bayesian physics informed neural networks for multitask and multiscale forward and inverse problems
Probabilistic neural data fusion for learning from an arbitrary number of multi-fidelity data sets
An Adaptive Non-Intrusive Multi-Fidelity Reduced Basis Method for Parameterized Partial Differential Equations
Adaptive Ensemble Kalman Inversion with Statistical Linearization
Surrogate modeling for Bayesian inverse problems based on physics-informed neural networks
Sequential Model Correction for Nonlinear Inverse Problems
Residual-based error correction for neural operator accelerated Infinite-dimensional Bayesian inverse problems
Scalable conditional deep inverse Rosenblatt transports using tensor trains and gradient-based dimension reduction
Deep Importance Sampling Using Tensor Trains with Application to a Priori and a Posteriori Rare Events
Efficient multifidelity likelihood-free Bayesian inference with adaptive computational resource allocation
Gradient-free Stein variational gradient descent with kernel approximation
Stein variational gradient descent with local approximations
Cholesky-Based Experimental Design for Gaussian Process and Kernel-Based Emulation and Calibration
Goal-oriented a-posteriori estimation of model error as an aid to parameter estimation
Multifidelity data fusion in convolutional encoder/decoder networks
Variable-order approach to nonlocal elasticity: theoretical formulation, order identification via deep learning, and applications
