Deep Importance Sampling Using Tensor Trains with Application to a Priori and a Posteriori Rare Events
DOI: 10.1137/23m1546981 · arXiv: 2209.01941 · Wikidata: Q129530654 · Scholia: Q129530654 · MaRDI QID: Q6189161
Tiangang Cui, Sergey V. Dolgov, Robert Scheichl
Publication date: 8 February 2024
Published in: SIAM Journal on Scientific Computing
Full work available at URL: https://arxiv.org/abs/2209.01941
Classification (MSC): Factorization of matrices (15A23); Bayesian inference (62F15); Monte Carlo methods (65C05); Numerical analysis or methods applied to Markov chains (65C40); Algorithms for approximation of functions (65D15); Numerical quadrature and cubature formulas (65D32); Multilinear algebra, tensor calculus (15A69); Numerical methods for inverse problems for boundary value problems involving PDEs (65N21); Numerical solution of inverse problems involving ordinary differential equations (65L09)
Cites Work
- Tensor-Train Decomposition
- TT-cross approximation for multidimensional arrays
- An efficient algorithm for rare-event probability estimation, combinatorial optimization, and counting
- Convergence rates of best \(N\)-term Galerkin approximations for a class of elliptic SPDEs
- Bayesian inference with optimal maps
- Convergence of adaptive mixtures of importance sampling schemes
- Dimensionality reduction and polynomial chaos acceleration of Bayesian inference in inverse problems
- Methods for approximating integrals in statistics with special emphasis on Bayesian integration problems
- A continuous analogue of the tensor-train decomposition
- Deep UQ: learning deep neural network surrogate models for high dimensional uncertainty quantification
- Deep composition of tensor-trains using squared inverse Rosenblatt transports
- Coupling the reduced-order model and the generative model for an importance sampling estimator
- Low-rank tensor reconstruction of concentrated densities with application to Bayesian inversion
- Adaptive multi-fidelity polynomial chaos approach to Bayesian inference in inverse problems
- Physics-constrained deep learning for high-dimensional surrogate modeling and uncertainty quantification without labeled data
- Approximation and sampling of multivariate probability distributions in the tensor train decomposition
- Scalable posterior approximations for large-scale Bayesian inverse problems via likelihood-informed parameter and state reduction
- Sparse-grid, reduced-basis Bayesian inversion: nonaffine-parametric nonlinear equations
- Multifidelity importance sampling
- A unified performance analysis of likelihood-informed subspace methods
- Analysis of tensor approximation schemes for continuous functions
- Spectral Tensor-Train Decomposition
- Sparse deterministic approximation of Bayesian inverse problems
- Active Subspace Methods in Theory and Practice: Applications to Kriging Surfaces
- Likelihood-informed dimension reduction for nonlinear inverse problems
- Alternating Minimal Energy Methods for Linear Systems in Higher Dimensions
- Data-driven model reduction for the Bayesian solution of inverse problems
- Parameter and State Model Reduction for Large-Scale Statistical Inverse Problems
- Tensor Spaces and Numerical Tensor Calculus
- Transport Map Accelerated Markov Chain Monte Carlo
- Multifidelity Preconditioning of the Cross-Entropy Method for Rare Event Simulation and Failure Probability Estimation
- Multilevel Sequential Importance Sampling for Rare Event Estimation
- Sequential Monte Carlo Samplers
- Model Reduction for Large-Scale Systems with High-Dimensional Parametric Input Space
- Sampling-free Bayesian inversion with adaptive hierarchical tensor representations
- The Wiener--Askey Polynomial Chaos for Stochastic Differential Equations
- Error Analysis for Probabilities of Rare Events with Approximate Models
- Cross-Entropy-Based Importance Sampling with Failure-Informed Dimension Reduction for Rare Event Simulation
- Rank Bounds for Approximating Gaussian Densities in the Tensor-Train Format
- Certified dimension reduction in nonlinear Bayesian inverse problems
- An Adaptive Surrogate Modeling Based on Deep Neural Networks for Large-Scale Bayesian Inverse Problems
- Quasi-Monte Carlo and Multilevel Monte Carlo Methods for Computing Posterior Expectations in Elliptic Inverse Problems
- Non-linear model reduction for uncertainty quantification in large-scale inverse problems
- A Multilevel Monte Carlo Method for Computing Failure Probabilities
- Remarks on a Multivariate Transformation
- Nonlinear Reduced Models for State and Parameter Estimation
- A Stochastic Collocation Method for Elliptic Partial Differential Equations with Random Input Data
- Simulating normalizing constants: From importance sampling to bridge sampling to path sampling
- Parallel computation of flow in heterogeneous media modelled by mixed finite elements
- Scalable conditional deep inverse Rosenblatt transports using tensor trains and gradient-based dimension reduction