Low-rank separated representation surrogates of high-dimensional stochastic functions: application in Bayesian inference
DOI: 10.1016/J.JCP.2013.12.024 · zbMATH Open: 1349.65052 · arXiv: 1306.5374 · OpenAlex: W2032522532 · MaRDI QID: Q348754 · FDO: Q348754
Authors: AbdoulAhad Validi
Publication date: 5 December 2016
Published in: Journal of Computational Physics
Abstract: This study introduces a non-intrusive approach, in the context of low-rank separated representations, to construct a surrogate of high-dimensional stochastic functions, e.g., PDEs/ODEs, in order to decrease the computational cost of Markov chain Monte Carlo simulations in Bayesian inference. The surrogate model is constructed via a regularized alternating least-squares regression, with Tikhonov regularization based on a roughening matrix that approximates the gradient of the solution, in conjunction with a perturbation-based error indicator to detect the optimal model complexity. The model approximates a vector of a continuous solution at discrete values of a physical variable. The number of random realizations required to achieve a successful approximation depends linearly on the function dimensionality. The computational cost of the model construction is quadratic in the number of random inputs, which potentially tackles the curse of dimensionality in high-dimensional stochastic functions. Furthermore, this vector-valued separated-representation-based model, in comparison to the available scalar-valued case, reduces the cost of approximation by a factor on the order of the vector size. The performance of the method is studied through its application to three numerical examples, including a 41-dimensional elliptic PDE and a 21-dimensional cavity flow.
Full work available at URL: https://arxiv.org/abs/1306.5374
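For orientation, the sketch below illustrates the kind of regularized alternating least-squares (ALS) fit of a rank-r separated representation described in the abstract. It is a minimal assumption-laden sketch, not the paper's implementation: the monomial basis, the first-difference roughening matrix, the regularization weight `lam`, and all names (`fit_als_surrogate`, etc.) are illustrative choices.

```python
import numpy as np

def fit_als_surrogate(samples, values, rank=3, degree=3, lam=1e-6, sweeps=20):
    """samples: (N, d) array of random inputs; values: (N,) model outputs.

    Fits u(x_1,...,x_d) ~= sum_l prod_i u_i^l(x_i) by alternating
    least squares with Tikhonov regularization (illustrative sketch).
    """
    N, d = samples.shape
    P = degree + 1
    # Univariate polynomial basis evaluated per dimension: phi[i] is (N, P).
    phi = [np.vander(samples[:, i], P, increasing=True) for i in range(d)]
    # Factor coefficients c[i] of shape (rank, P), random initial guess.
    rng = np.random.default_rng(0)
    c = [rng.normal(size=(rank, P)) for _ in range(d)]
    # First-difference roughening matrix as the Tikhonov operator (an
    # assumed choice; the paper's operator approximates the solution gradient).
    L = np.eye(P) - np.eye(P, k=1)
    reg = lam * (L.T @ L)
    for _ in range(sweeps):
        for i in range(d):
            # Product of all factors except dimension i: shape (N, rank).
            other = np.ones((N, rank))
            for j in range(d):
                if j != i:
                    other *= phi[j] @ c[j].T
            # Linear least-squares design for dimension i's coefficients.
            A = (other[:, :, None] * phi[i][:, None, :]).reshape(N, rank * P)
            R = np.kron(np.eye(rank), reg)      # block-diagonal regularizer
            coef = np.linalg.solve(A.T @ A + R, A.T @ values)
            c[i] = coef.reshape(rank, P)

    def surrogate(x):
        x = np.atleast_2d(x)
        prod = np.ones((x.shape[0], rank))
        for i in range(d):
            prod *= np.vander(x[:, i], P, increasing=True) @ c[i].T
        return prod.sum(axis=1)

    return surrogate
```

In the setting the abstract describes, such a surrogate would stand in for the expensive forward solver inside an MCMC loop, so each likelihood evaluation reduces to a few polynomial evaluations rather than a full PDE/ODE solve.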
Recommendations
- Non-intrusive low-rank separated approximation of high-dimensional stochastic models
- An adaptive reduced basis ANOVA method for high-dimensional Bayesian inverse problems
- Surrogate modeling of high-dimensional problems via data-driven polynomial chaos expansions and sparse partial least square
- Convergence analysis of surrogate-based methods for Bayesian inverse problems
- Non-intrusive tensor reconstruction for high-dimensional random PDEs
Keywords: inverse problem, uncertainty quantification, separated representation, Bayesian inference, high-dimensional PDE/ODE
Cites Work
- Convergence Properties of the Nelder--Mead Simplex Method in Low Dimensions
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Title not available
- Markov chains for exploring posterior distributions. (With discussion)
- An adaptive Metropolis algorithm
- Delayed rejection in reversible jump Metropolis-Hastings.
- Title not available
- Tensor Decompositions and Applications
- Bayesian Inference in Econometric Models Using Monte Carlo Integration
- Equation of State Calculations by Fast Computing Machines
- Adaptive proposal distribution for random walk Metropolis algorithm
- Numerical operator calculus in higher dimensions
- Algorithms for Numerical Analysis in High Dimensions
- Nonstationary inverse problems and state estimation
- An iterative thresholding algorithm for linear inverse problems with a sparsity constraint
- Computational Methods for Inverse Problems
- Principal component analysis of three-mode data by means of alternating least squares algorithms
- Efficient Monte Carlo Procedures for Generating Points Uniformly Distributed over Bounded Regions
- Bayesian inference with optimal maps
- An adaptive multi-element generalized polynomial chaos method for stochastic differential equations
- Spectral Methods for Uncertainty Quantification
- Markov Chains
- Inverse Problem Theory and Methods for Model Parameter Estimation
- Title not available
- Discrete inverse problems. Insight and algorithms.
- An introduction to the mathematical theory of inverse problems
- Dimensionality reduction and polynomial chaos acceleration of Bayesian inference in inverse problems
- Ordering and improving the performance of Monte Carlo Markov chains.
- Linear models. Least squares and alternatives.
- An efficient Bayesian inference approach to inverse problems based on an adaptive sparse grid collocation method
- Title not available
- Non-intrusive low-rank separated approximation of high-dimensional stochastic models
- A new family of solvers for some classes of multidimensional partial differential equations encountered in kinetic theory modeling of complex fluids
- Multivariate regression and machine learning with sums of separable functions
- Stochastic model reduction for chaos representations
- Bayesian analysis of dichotomous quantal response models
- Learning to Predict Physical Properties using Sums of Separable Functions
- Title not available
- Multiparameter Univariate Bayesian Analysis
- Title not available
- Markov Chain Monte Carlo Methods for High Dimensional Inversion in Remote Sensing
- When did Bayesian inference become "Bayesian"?
Cited In (5)
- Sparse low-rank separated representation models for learning from data
- Polynomial meta-models with canonical low-rank approximations: numerical insights and comparison to sparse polynomial chaos expansions
- The Optimization Landscape for Fitting a Rank-2 Tensor with a Rank-1 Tensor
- Title not available
- Stochastic Collocation Algorithms Using $l_1$-Minimization for Bayesian Solution of Inverse Problems