Gaussian processes with built-in dimensionality reduction: applications to high-dimensional uncertainty propagation
DOI: 10.1016/J.JCP.2016.05.039 · zbMATH Open: 1349.65049 · arXiv: 1602.04550 · OpenAlex: W2282795067 · MaRDI QID: Q726924
Authors: I. Bilionis, Marcial Gonzalez, Rohit K. Tripathy
Publication date: 5 December 2016
Published in: Journal of Computational Physics
Full work available at URL: https://arxiv.org/abs/1602.04550
Recommendations
- Structured Bayesian Gaussian process latent variable model: applications to data-driven dimensionality reduction and high-dimensional inversion
- Some structural approximations for efficient probability propagation in evolving high dimensional Gaussian processes
- Gaussian Process Subspace Prediction for Model Reduction
- Multi-output separable Gaussian process: towards an efficient, fully Bayesian paradigm for uncertainty quantification
- Uncertainty quantification using the nearest neighbor Gaussian process
- Enhanced Gaussian processes and applications
- A scalable approximate Bayesian inference for high-dimensional Gaussian processes
Keywords: dimensionality reduction; Gaussian process regression; uncertainty quantification; Stiefel manifold; active subspace; granular crystals
MSC classifications: Bayesian inference (62F15) · Classification and discrimination; cluster analysis (statistical aspects) (62H30) · General nonlinear regression (62J02)
Cites Work
- Efficient global optimization of expensive black-box functions
- A feasible method for optimization with orthogonality constraints
- Title not available
- Reversible jump Markov chain Monte Carlo computation and Bayesian model determination
- Design and analysis of computer experiments. With comments and a rejoinder by the authors
- Title not available
- Title not available
- A Limited Memory Algorithm for Bound Constrained Optimization
- Pattern recognition and machine learning.
- Title not available
- A taxonomy of global optimization methods based on response surfaces
- Machine learning. A probabilistic perspective
- Monte Carlo sampling methods using Markov chains and their applications
- Monte Carlo strategies in scientific computing
- Equation of State Calculations by Fast Computing Machines
- Introduction to Stochastic Search and Optimization
- Riemann Manifold Langevin and Hamiltonian Monte Carlo Methods
- Probabilistic Sensitivity Analysis of Complex Models: A Bayesian Approach
- Bayes-Hermite quadrature
- Evaluating Derivatives
- Probability Theory
- The Wiener--Askey Polynomial Chaos for Stochastic Differential Equations
- Quasi-Monte Carlo Finite Element Methods for a Class of Elliptic Partial Differential Equations with Random Coefficients
- Global sensitivity analysis: The primer
- A Stochastic Collocation Method for Elliptic Partial Differential Equations with Random Input Data
- Multi-level Monte Carlo finite element method for elliptic PDEs with stochastic coefficients
- Uncertainty quantification. Theory, implementation, and applications
- High-Order Collocation Methods for Differential Equations with Random Inputs
- Constrained global optimization of expensive black box functions using radial basis functions
- A radial basis function method for global optimization
- An adaptive hierarchical sparse grid collocation algorithm for the solution of stochastic differential equations
- Bayesian nonparametric modeling for functional analysis of variance
- General foundations of high-dimensional model representations
- High‐dimensional model representation for structural reliability analysis
- Bayesian learning for neural networks
- Quasi-Monte Carlo integration
- Evolutionary Multi-Criterion Optimization
- Efficient implementation of high dimensional model representations
- High dimensional model representations generated from low dimensional data samples. I: mp-cut-HDMR
- Efficient collocational approach for parametric uncertainty analysis
- Multi-output local Gaussian process regression: applications to uncertainty quantification
- An adaptive high-dimensional stochastic model representation technique for the solution of stochastic partial differential equations
- Multi-output separable Gaussian process: towards an efficient, fully Bayesian paradigm for uncertainty quantification
- Title not available
- Kernel principal component analysis for stochastic input model generation
- Free energy computations by minimization of Kullback-Leibler divergence: An efficient adaptive biasing potential method for sparse representations
- A nonlocal contact formulation for confined granular systems
- Bayesian functional ANOVA modeling using Gaussian process prior distributions
- Additive covariance kernels for high-dimensional Gaussian process modeling
- DYNAMIC PROGRAMMING AND LAGRANGE MULTIPLIERS
- Multidimensional Adaptive Relevance Vector Machines for Uncertainty Quantification
- ANALYSIS OF VARIANCE-BASED MIXED MULTISCALE FINITE ELEMENT METHOD AND APPLICATIONS IN STOCHASTIC TWO-PHASE FLOWS
- Erratum: Active Subspace Methods in Theory and Practice: Applications to Kriging Surfaces
- An Adaptive ANOVA-Based Data-Driven Stochastic Method for Elliptic PDEs with Random Coefficient
- Solution of inverse problems with limited forward solver evaluations: a Bayesian perspective
- Uncertainty propagation using infinite mixture of gaussian processes and variational Bayesian inference
Cited In (45)
- Deep UQ: learning deep neural network surrogate models for high dimensional uncertainty quantification
- A surrogate assisted adaptive framework for robust topology optimization
- Model order reduction for large-scale structures with local nonlinearities
- Systems of Gaussian process models for directed chains of solvers
- Reduced-load equivalence for Gaussian processes
- Data-Driven Polynomial Ridge Approximation Using Variable Projection
- Learning to solve Bayesian inverse problems: an amortized variational inference approach using Gaussian and flow guides
- Sequential Learning of Active Subspaces
- Compressive sensing adaptation for polynomial chaos expansions
- Efficient reduced-rank methods for Gaussian processes with eigenfunction expansions
- Collaborative and adaptive Bayesian optimization for bounding variances and probabilities under hybrid uncertainties
- High-Dimensional Nonlinear Multi-Fidelity Model with Gradient-Free Active Subspace Method
- Optimal design for kernel interpolation: applications to uncertainty quantification
- Gaussian Quadrature and Polynomial Approximation for One-Dimensional Ridge Functions
- Simulator-free solution of high-dimensional stochastic elliptic partial differential equations using deep neural networks
- Clustered active-subspace based local Gaussian process emulator for high-dimensional and complex computer models
- A generalized active subspace for dimension reduction in mixed aleatory-epistemic uncertainty quantification
- A sample-efficient deep learning method for multivariate uncertainty qualification of acoustic-vibration interaction problems
- Accelerated scale bridging with sparsely approximated Gaussian learning
- Surrogate modeling for high dimensional uncertainty propagation via deep kernel polynomial chaos expansion
- Bounds optimization of model response moments: a twin-engine Bayesian active learning method
- Bayesian adaptation of chaos representations using variational inference and sampling on geodesics
- Dimension Reduction for Gaussian Process Emulation: An Application to the Influence of Bathymetry on Tsunami Heights
- Polynomial chaos expansions on principal geodesic Grassmannian submanifolds for surrogate modeling and uncertainty quantification
- Symmetry results for decay solutions of elliptic systems in the whole space
- Dimension Reduction via Gaussian Ridge Functions
- Kernel‐based active subspaces with application to computational fluid dynamics parametric problems using the discontinuous Galerkin method
- A novel active learning method based on matrix-operation RBF model for high-dimensional reliability analysis
- Variational Bayesian surrogate modelling with application to robust design optimisation
- Generalized bounds for active subspaces
- Hilbert space methods for reduced-rank Gaussian process regression
- A survey of unsupervised learning methods for high-dimensional uncertainty quantification in black-box-type problems
- Hierarchical surrogate model with dimensionality reduction technique for high-dimensional uncertainty propagation
- ANOVA Gaussian process modeling for high-dimensional stochastic computational models
- On the influence of over-parameterization in manifold based surrogates and deep neural operators
- On the Deep Active-Subspace Method
- Surrogate assisted active subspace and active subspace assisted surrogate -- a new paradigm for high dimensional structural reliability analysis
- Surrogate modeling of high-dimensional problems via data-driven polynomial chaos expansions and sparse partial least square
- Bayesian neural networks for predicting uncertainty in full-field material response
- A review of recent advances in surrogate models for uncertainty quantification of high-dimensional engineering applications
- Gaussian Process Subspace Prediction for Model Reduction
- A near-stationary subspace for ridge approximation
- Bayesian learning of orthogonal embeddings for multi-fidelity Gaussian processes
- An improved sufficient dimension reduction-based kriging modeling method for high-dimensional evaluation-expensive problems
- Multi-fidelity cost-aware Bayesian optimization