Scalable conditional deep inverse Rosenblatt transports using tensor trains and gradient-based dimension reduction
DOI: 10.1016/J.JCP.2023.112103 · arXiv: 2106.04170 · MaRDI QID: Q6158090 · FDO: Q6158090
Authors: Tiangang Cui, Sergey Dolgov, Olivier Zahm
Publication date: 31 May 2023
Published in: Journal of Computational Physics
Full work available at URL: https://arxiv.org/abs/2106.04170
Recommendations
- Deep composition of tensor-trains using squared inverse Rosenblatt transports
- Data-free likelihood-informed dimension reduction of Bayesian inverse problems
- Stein variational gradient descent on infinite-dimensional space and applications to statistical inverse problems
- Low-rank tensor reconstruction of concentrated densities with application to Bayesian inversion
- A distributed framework for the construction of transport maps
Keywords: Markov chain Monte Carlo, dimension reduction, approximate Bayesian computation, inverse problems, generative models, tensor train, transport maps
Mathematics Subject Classification: Parametric inference (62Fxx); Numerical methods for partial differential equations, boundary value problems (65Nxx); Probabilistic methods, stochastic differential equations (65Cxx)
Cites Work
- The no-U-turn sampler: adaptively setting path lengths in Hamiltonian Monte Carlo
- Monte Carlo strategies in scientific computing
- Sequential Monte Carlo Samplers
- Remarks on a Multivariate Transformation
- Handbook of Markov Chain Monte Carlo
- Title not available
- Parameter Estimation for Differential Equations: a Generalized Smoothing Approach
- Tensor Decompositions and Applications
- A sequential particle filter method for static models
- TT-cross approximation for multidimensional arrays
- CUR matrix decompositions for improved data analysis
- Simulating normalizing constants: From importance sampling to bridge sampling to path sampling
- Discretization-invariant Bayesian inversion and Besov space priors
- Inverse problems: a Bayesian perspective
- Parameter and state model reduction for large-scale statistical inverse problems
- A Stochastic Collocation Method for Elliptic Partial Differential Equations with Random Input Data
- The Wiener--Askey Polynomial Chaos for Stochastic Differential Equations
- Pseudo-skeleton approximations by matrices of maximal volume
- Generalization of an inequality by Talagrand and links with the logarithmic Sobolev inequality
- Convergence rates of best \(N\)-term Galerkin approximations for a class of elliptic SPDEs
- Data-driven optimal transport
- A note on an inequality involving the normal distribution
- Alternating minimal energy methods for linear systems in higher dimensions
- MCMC methods for functions: modifying old algorithms to make them faster
- Dimensionality reduction and polynomial chaos acceleration of Bayesian inference in inverse problems
- Model Reduction for Large-Scale Systems with High-Dimensional Parametric Input Space
- MCMC METHODS FOR DIFFUSION BRIDGES
- Adaptive multi-fidelity polynomial chaos approach to Bayesian inference in inverse problems
- Sparse deterministic approximation of Bayesian inverse problems
- Nonlinear model reduction for uncertainty quantification in large-scale inverse problems
- Nonlinear reduced models for state and parameter estimation
- Besov priors for Bayesian inverse problems
- Bayesian inference for differential equations
- A computational framework for infinite-dimensional Bayesian inverse problems. II: stochastic Newton MCMC with application to ice sheet flow inverse problems
- Data-driven model reduction for the Bayesian solution of inverse problems
- A continuous analogue of the tensor-train decomposition
- Spectral tensor-train decomposition
- Scalable posterior approximations for large-scale Bayesian inverse problems via likelihood-informed parameter and state reduction
- Physics-constrained deep learning for high-dimensional surrogate modeling and uncertainty quantification without labeled data
- Approximation and sampling of multivariate probability distributions in the tensor train decomposition
- Sampling-free Bayesian inversion with adaptive hierarchical tensor representations
- A hybrid alternating least squares-TT-cross algorithm for parametric PDEs
- Deep UQ: learning deep neural network surrogate models for high dimensional uncertainty quantification
- A Family of Nonparametric Density Estimation Algorithms
- Sparse-grid, reduced-basis Bayesian inversion
- Transport map accelerated Markov chain Monte Carlo
- Multifidelity Dimension Reduction via Active Subspaces
- An adaptive surrogate modeling based on deep neural networks for large-scale Bayesian inverse problems
- Deep composition of tensor-trains using squared inverse Rosenblatt transports
- Low-rank tensor reconstruction of concentrated densities with application to Bayesian inversion
Cited In (8)
- Learning to solve Bayesian inverse problems: an amortized variational inference approach using Gaussian and flow guides
- On the representation and learning of monotone triangular transport maps
- Conditional sampling with monotone GANs: from generative models to likelihood-free inference
- Principal feature detection via \(\phi \)-Sobolev inequalities
- Optimal experimental design: formulations and computations
- Mini-workshop: Nonlinear approximation of high-dimensional functions in scientific computing. Abstracts from the mini-workshop held October 15--20, 2023
- Deep Importance Sampling Using Tensor Trains with Application to a Priori and a Posteriori Rare Events
- A low-rank solver for parameter estimation and uncertainty quantification in time-dependent systems of partial differential equations