Active Subspace Methods in Theory and Practice: Applications to Kriging Surfaces
Publication: 2929996
DOI: 10.1137/130916138
zbMath: 1311.65008
arXiv: 1304.2070
OpenAlex: W3103869760
MaRDI QID: Q2929996
Authors: Paul G. Constantine, Eric Dow, Qiqi Wang
Publication date: 17 November 2014
Published in: SIAM Journal on Scientific Computing
Full work available at URL: https://arxiv.org/abs/1304.2070
Related Items (92)
Forward and backward uncertainty quantification with active subspaces: application to hypersonic flows around a cylinder ⋮ Preintegration via Active Subspace ⋮ Efficient derivative-free Bayesian inference for large-scale inverse problems ⋮ Clustered active-subspace based local Gaussian process emulator for high-dimensional and complex computer models ⋮ Artificial neural network based response surface for data-driven dimensional analysis ⋮ Exploiting active subspaces to quantify uncertainty in the numerical simulation of the hyshot II scramjet ⋮ Dimension-independent likelihood-informed MCMC ⋮ Parametric domain decomposition for accurate reduced order models: applications of MP-LROM methodology ⋮ Constructing Surrogate Models of Complex Systems with Enhanced Sparsity: Quantifying the Influence of Conformational Uncertainty in Biomolecular Solvation ⋮ Certified dimension reduction in nonlinear Bayesian inverse problems ⋮ Kriging surrogate model with coordinate transformation based on likelihood and gradient ⋮ A training set subsampling strategy for the reduced basis method ⋮ Generalized bounds for active subspaces ⋮ Surrogate modeling of high-dimensional problems via data-driven polynomial chaos expansions and sparse partial least square ⋮ Computational graph completion ⋮ A Probabilistic Subspace Bound with Application to Active Subspaces ⋮ Basis adaptation and domain decomposition for steady-state partial differential equations with random coefficients ⋮ Stability analysis of thermo-acoustic nonlinear eigenproblems in annular combustors. Part II. Uncertainty quantification ⋮ \(\boldsymbol{\mathcal{L}_2}\)-Optimal Reduced-Order Modeling Using Parameter-Separable Forms ⋮ Efficient parameter estimation for a methane hydrate model with active subspaces ⋮ Adaptive group Lasso neural network models for functions of few variables and time-dependent data ⋮ Deep capsule encoder–decoder network for surrogate modeling and uncertainty quantification ⋮ Kernel-based active subspaces with application to computational fluid dynamics parametric problems using the discontinuous Galerkin method ⋮ Optimization by moving ridge functions: derivative-free optimization for computationally intensive functions ⋮ Tensor rank reduction via coordinate flows ⋮ Multifidelity Dimension Reduction via Active Subspaces ⋮ On the Deep Active-Subspace Method ⋮ An improved sufficient dimension reduction-based kriging modeling method for high-dimensional evaluation-expensive problems ⋮ Projection pursuit adaptation on polynomial chaos expansions ⋮ Model-independent detection of new physics signals using interpretable semisupervised classifier tests ⋮ Data-Driven Kernel Designs for Optimized Greedy Schemes: A Machine Learning Perspective ⋮ Level Set Learning with Pseudoreversible Neural Networks for Nonlinear Dimension Reduction in Function Approximation ⋮ Posterior consistency for Gaussian process approximations of Bayesian posterior distributions ⋮ Optimal design of validation experiments for the prediction of quantities of interest ⋮ A novel and fully automated coordinate system transformation scheme for near optimal surrogate construction ⋮ Deep Importance Sampling Using Tensor Trains with Application to a Priori and a Posteriori Rare Events ⋮ Clustering dimensionless learning for multiple-physical-regime systems ⋮ Polynomial-chaos-based conditional statistics for probabilistic learning with heterogeneous data applied to atomic collisions of helium on graphite substrate ⋮ Derivative-informed neural operator: an efficient framework for high-dimensional parametric derivative learning ⋮ An efficient dimension reduction for the Gaussian process emulation of two nested codes with functional outputs ⋮ A Preconditioned Low-Rank Projection Method with a Rank-Reduction Scheme for Stochastic Partial Differential Equations ⋮ A distributed active subspace method for scalable surrogate modeling of function valued outputs ⋮ Compressive sensing adaptation for polynomial chaos expansions ⋮ Kriging-sparse polynomial dimensional decomposition surrogate model with adaptive refinement ⋮ PLS-based adaptation for efficient PCE representation in high dimensions ⋮ Adaptive sparse polynomial dimensional decomposition for derivative-based sensitivity ⋮ Adaptive dimension reduction to accelerate infinite-dimensional geometric Markov chain Monte Carlo ⋮ Manifold learning for parameter reduction ⋮ A physics-aware, probabilistic machine learning framework for coarse-graining high-dimensional systems in the small data regime ⋮ Kriging-enhanced ensemble variational data assimilation for scalar-source identification in turbulent environments ⋮ Simulator-free solution of high-dimensional stochastic elliptic partial differential equations using deep neural networks ⋮ Transport Map Accelerated Adaptive Importance Sampling, and Application to Inverse Problems Arising from Multiscale Stochastic Reaction Networks ⋮ Dimension Reduction for Gaussian Process Emulation: An Application to the Influence of Bathymetry on Tsunami Heights ⋮ Goal-oriented adaptive surrogate construction for stochastic inversion ⋮ A data-driven framework for sparsity-enhanced surrogates with arbitrary mutually dependent randomness ⋮ Systems of Gaussian process models for directed chains of solvers ⋮ Accelerated basis adaptation in homogeneous chaos spaces ⋮ Bayesian learning of orthogonal embeddings for multi-fidelity Gaussian processes ⋮ Latent map Gaussian processes for mixed variable metamodeling ⋮ Gradient free active subspace construction using Morris screening elementary effects ⋮ Quantifying and reducing model-form uncertainties in Reynolds-averaged Navier-Stokes simulations: a data-driven, physics-informed Bayesian approach ⋮ Deep UQ: learning deep neural network surrogate models for high dimensional uncertainty quantification ⋮ Sequential Learning of Active Subspaces ⋮ Parameter Selection and Verification Techniques Based on Global Sensitivity Analysis Illustrated for an HIV Model ⋮ Data-Driven Polynomial Ridge Approximation Using Variable Projection ⋮ A generalized active subspace for dimension reduction in mixed aleatory-epistemic uncertainty quantification ⋮ Embedded ridge approximations ⋮ Using automatic differentiation for compressive sensing in uncertainty quantification ⋮ Inverse regression-based uncertainty quantification algorithms for high-dimensional models: theory and practice ⋮ Enhancing sparsity of Hermite polynomial expansions by iterative rotations ⋮ Dimension Reduction via Gaussian Ridge Functions ⋮ On the choice of the low-dimensional domain for global optimization via random embeddings ⋮ Accelerating Markov Chain Monte Carlo with Active Subspaces ⋮ On the unsteady Darcy–Forchheimer–Brinkman equation in local and nonlocal tumor growth models ⋮ Derivative-informed projected neural networks for high-dimensional parametric maps governed by PDEs ⋮ Active-subspace analysis of exceedance probability for shallow-water waves ⋮ Divide and conquer: an incremental sparsity promoting compressive sampling approach for polynomial chaos expansions ⋮ A near-stationary subspace for ridge approximation ⋮ Gradient-Free Construction of Active Subspaces for Dimension Reduction in Complex Models with Applications to Neutronics ⋮ Sparse mixture models inspired by ANOVA decompositions ⋮ A multi-fidelity polynomial chaos-greedy Kaczmarz approach for resource-efficient uncertainty quantification on limited budget ⋮ Surrogate assisted active subspace and active subspace assisted surrogate -- a new paradigm for high dimensional structural reliability analysis ⋮ Gaussian Quadrature and Polynomial Approximation for One-Dimensional Ridge Functions ⋮ Learning constitutive relations from indirect observations using deep neural networks ⋮ A Supervised Learning Approach Involving Active Subspaces for an Efficient Genetic Algorithm in High-Dimensional Optimization Problems ⋮ Calibration of a SEIR-SEI epidemic model to describe the zika virus outbreak in Brazil ⋮ Data-free likelihood-informed dimension reduction of Bayesian inverse problems ⋮ Nonlinear Level Set Learning for Function Approximation on Sparse Data with Applications to Parametric Differential Equations ⋮ Generalization bounds for sparse random feature expansions ⋮ Robust and resource-efficient identification of two hidden layer neural networks ⋮ Exploring active subspace for neural network prediction of oscillating combustion