Scalable conditional deep inverse Rosenblatt transports using tensor trains and gradient-based dimension reduction
From MaRDI portal
Publication:6158090
Abstract: We present a novel offline-online method to mitigate the computational burden of characterizing posterior random variables in statistical learning. In the offline phase, the proposed method learns the joint law of the parameter random variables and the observable random variables in the tensor-train (TT) format. In the online phase, the resulting order-preserving conditional transport can characterize the posterior random variables given newly observed data in real time. Compared with state-of-the-art normalizing flow techniques, the proposed method relies on function approximation and is equipped with a thorough performance analysis. The function approximation perspective also allows us to further extend the capability of transport maps in challenging problems with high-dimensional observations and high-dimensional parameters. On the one hand, we present novel heuristics to reorder and/or reparametrize the variables to enhance the approximation power of TT. On the other hand, we integrate the TT-based transport maps and the parameter reordering/reparametrization into layered compositions to further improve the performance of the resulting transport maps. We demonstrate the efficiency of the proposed method on various statistical learning tasks in ordinary differential equations (ODEs) and partial differential equations (PDEs).
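The offline-online split described above can be illustrated on a toy problem. The sketch below is not the paper's TT-based method; it replaces the tensor-train approximation of the joint density with a plain tensor grid in two dimensions, and the order-preserving conditional transport with a one-dimensional inverse-CDF map. All variable names and the Gaussian toy model are illustrative assumptions.

```python
import numpy as np

# Offline phase: tabulate the joint density p(x, y) on a tensor grid
# (a stand-in for the paper's tensor-train approximation).
# Toy model: parameter x ~ N(0, 1), observable y | x ~ N(x, 0.5^2).
xs = np.linspace(-5, 5, 401)
ys = np.linspace(-5, 5, 401)
X, Y = np.meshgrid(xs, ys, indexing="ij")
sig = 0.5
joint = np.exp(-0.5 * X**2) * np.exp(-0.5 * ((Y - X) / sig) ** 2)

def conditional_inverse_transport(y_obs, u):
    """Online phase: map uniform samples u to posterior samples x | y = y_obs
    by inverting the conditional CDF (an order-preserving 1D transport)."""
    iy = np.argmin(np.abs(ys - y_obs))
    cond = joint[:, iy]            # unnormalized p(x | y_obs) on the x-grid
    cdf = np.cumsum(cond)
    cdf /= cdf[-1]                 # normalize to a proper CDF
    return np.interp(u, cdf, xs)   # invert the monotone conditional CDF

# For y_obs = 1.0 the posterior is analytically N(0.8, 0.2).
u = np.random.default_rng(0).uniform(size=20_000)
samples = conditional_inverse_transport(1.0, u)
```

Because the map is built once offline, conditioning on a new `y_obs` costs only a column lookup and a 1D interpolation; the paper's contribution is making this construction scale to high dimensions, where a dense grid is infeasible but a TT decomposition is not.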
Recommendations
- Deep composition of tensor-trains using squared inverse Rosenblatt transports
- Data-free likelihood-informed dimension reduction of Bayesian inverse problems
- Stein variational gradient descent on infinite-dimensional space and applications to statistical inverse problems
- Low-rank tensor reconstruction of concentrated densities with application to Bayesian inversion
- A distributed framework for the construction of transport maps
Cites work
- Scientific article, zbMATH DE number 7370574 (no title available)
- A Family of Nonparametric Density Estimation Algorithms
- A Stochastic Collocation Method for Elliptic Partial Differential Equations with Random Input Data
- A computational framework for infinite-dimensional Bayesian inverse problems. II: stochastic Newton MCMC with application to ice sheet flow inverse problems
- A continuous analogue of the tensor-train decomposition
- A hybrid alternating least squares-TT-cross algorithm for parametric PDEs
- A note on an inequality involving the normal distribution
- A sequential particle filter method for static models
- Adaptive multi-fidelity polynomial chaos approach to Bayesian inference in inverse problems
- Alternating minimal energy methods for linear systems in higher dimensions
- An adaptive surrogate modeling based on deep neural networks for large-scale Bayesian inverse problems
- Approximation and sampling of multivariate probability distributions in the tensor train decomposition
- Bayesian inference for differential equations
- Besov priors for Bayesian inverse problems
- CUR matrix decompositions for improved data analysis
- Convergence rates of best \(N\)-term Galerkin approximations for a class of elliptic SPDEs
- Data-driven model reduction for the Bayesian solution of inverse problems
- Data-driven optimal transport
- Deep UQ: learning deep neural network surrogate models for high dimensional uncertainty quantification
- Deep composition of tensor-trains using squared inverse Rosenblatt transports
- Dimensionality reduction and polynomial chaos acceleration of Bayesian inference in inverse problems
- Discretization-invariant Bayesian inversion and Besov space priors
- Generalization of an inequality by Talagrand and links with the logarithmic Sobolev inequality
- Handbook of Markov Chain Monte Carlo
- Inverse problems: a Bayesian perspective
- Low-rank tensor reconstruction of concentrated densities with application to Bayesian inversion
- MCMC METHODS FOR DIFFUSION BRIDGES
- MCMC methods for functions: modifying old algorithms to make them faster
- Model Reduction for Large-Scale Systems with High-Dimensional Parametric Input Space
- Monte Carlo strategies in scientific computing
- Multifidelity Dimension Reduction via Active Subspaces
- Nonlinear model reduction for uncertainty quantification in large-scale inverse problems
- Nonlinear reduced models for state and parameter estimation
- Parameter Estimation for Differential Equations: a Generalized Smoothing Approach
- Parameter and state model reduction for large-scale statistical inverse problems
- Physics-constrained deep learning for high-dimensional surrogate modeling and uncertainty quantification without labeled data
- Pseudo-skeleton approximations by matrices of maximal volume
- Remarks on a Multivariate Transformation
- Sampling-free Bayesian inversion with adaptive hierarchical tensor representations
- Scalable posterior approximations for large-scale Bayesian inverse problems via likelihood-informed parameter and state reduction
- Sequential Monte Carlo Samplers
- Simulating normalizing constants: From importance sampling to bridge sampling to path sampling
- Sparse deterministic approximation of Bayesian inverse problems
- Sparse-grid, reduced-basis Bayesian inversion
- Spectral tensor-train decomposition
- TT-cross approximation for multidimensional arrays
- Tensor Decompositions and Applications
- The Wiener--Askey Polynomial Chaos for Stochastic Differential Equations
- The no-U-turn sampler: adaptively setting path lengths in Hamiltonian Monte Carlo
- Transport map accelerated Markov chain Monte Carlo
Cited in (8)
- Principal feature detection via \(\phi \)-Sobolev inequalities
- A low-rank solver for parameter estimation and uncertainty quantification in time-dependent systems of partial differential equations
- Mini-workshop: Nonlinear approximation of high-dimensional functions in scientific computing. Abstracts from the mini-workshop held October 15--20, 2023
- On the representation and learning of monotone triangular transport maps
- Optimal experimental design: formulations and computations
- Learning to solve Bayesian inverse problems: an amortized variational inference approach using Gaussian and flow guides
- Deep Importance Sampling Using Tensor Trains with Application to a Priori and a Posteriori Rare Events
- Conditional sampling with monotone GANs: from generative models to likelihood-free inference