Gradient-based dimension reduction of multivariate vector-valued functions
From MaRDI portal
Publication:5220403
Abstract: Multivariate functions encountered in high-dimensional uncertainty quantification problems often vary most strongly along a few dominant directions in the input parameter space. We propose a gradient-based method for detecting these directions and using them to construct ridge approximations of such functions, in the case where the functions are vector-valued (e.g., taking values in \(\mathbb{R}^n\)). The methodology consists of minimizing an upper bound on the approximation error, obtained by subspace Poincaré inequalities. We provide a thorough mathematical analysis in the case where the parameter space is equipped with a Gaussian probability measure. The resulting method generalizes the notion of active subspaces associated with scalar-valued functions. A numerical illustration shows that using gradients of the function yields effective dimension reduction. We also show how the choice of norm on the codomain of the function affects its low-dimensional approximation.
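The construction the abstract describes can be sketched numerically: draw Gaussian input samples, average the Gram matrices of the Jacobian of the vector-valued function, and take the leading eigenvectors of that average as the dominant subspace. The following is a minimal illustrative sketch of this general active-subspace recipe, not the paper's own code; the test function, the Monte Carlo estimator, and all names here are assumptions for illustration.

```python
import numpy as np

def active_subspace(grad_f, d, n_samples=1000, rank=2, rng=None):
    """Estimate a dominant subspace of f: R^d -> R^m from Jacobian samples.

    grad_f(x) returns the m-by-d Jacobian of f at x. Inputs are drawn
    from a standard Gaussian on R^d, matching the Gaussian-measure
    setting analyzed in the paper.
    """
    rng = np.random.default_rng(rng)
    H = np.zeros((d, d))
    for _ in range(n_samples):
        x = rng.standard_normal(d)
        J = grad_f(x)                  # Jacobian, shape (m, d)
        H += J.T @ J                   # accumulate a Monte Carlo estimate of E[J^T J]
    H /= n_samples
    eigvals, eigvecs = np.linalg.eigh(H)
    order = np.argsort(eigvals)[::-1]  # sort eigenvalues in descending order
    return eigvals[order], eigvecs[:, order[:rank]]

# Illustrative example: a map from R^5 to R^3 that is an exact ridge
# function, f(x) = g(Ax), so it only varies along two directions.
A = np.array([[1.0, 2.0, 0.0, 0.0, 0.0],
              [0.0, 1.0, 0.0, 0.0, 0.0]])

def f_jac(x):
    # g(y) = (sin y1, cos y2, y1*y2); Jacobian of f via the chain rule: J_g(Ax) @ A
    y = A @ x
    Jg = np.array([[np.cos(y[0]), 0.0],
                   [0.0, -np.sin(y[1])],
                   [y[1], y[0]]])
    return Jg @ A

eigvals, W = active_subspace(f_jac, d=5, rank=2, rng=0)
# The trailing eigenvalues vanish (up to round-off) because f varies
# only in span(A^T); the two columns of W span that subspace.
```

Because the averaged matrix here is \(A^\top \mathbb{E}[J_g^\top J_g] A\), its rank is at most two, so the eigenvalue decay directly reveals the ridge dimension.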
Recommendations
- Symmetry results for decay solutions of elliptic systems in the whole space
- Active subspace methods in theory and practice: applications to kriging surfaces
- Gaussian processes with built-in dimensionality reduction: applications to high-dimensional uncertainty propagation
- Gradient free active subspace construction using Morris screening elementary effects
- Dimension reduction via Gaussian ridge functions
Cites work
- scientific article; zbMATH DE number 1220060
- scientific article; zbMATH DE number 1392848
- scientific article; zbMATH DE number 1425054
- Active subspaces. Emerging ideas for dimension reduction in parameter studies
- An adaptive sparse grid algorithm for elliptic PDEs with lognormal diffusion coefficient
- An approximation theoretic perspective of Sobol' indices with dependent variables
- An inequality for the multivariate normal distribution
- Asymptotics for pooled marginal slicing estimator based on SIR\(_\alpha\) approach
- Capturing ridge functions in high dimensions from point queries
- Certified dimension reduction in nonlinear Bayesian inverse problems
- Comment
- Concentration inequalities. A nonasymptotic theory of independence
- Derivative based global sensitivity measures and their link with global sensitivity indices
- Derivative-Based Global Sensitivity Measures and Their Link with Sobol’ Sensitivity Indices
- Derivative-based global sensitivity measures: general links with Sobol' indices and numerical tests
- Dimension Reduction for Multivariate Response Data
- Dimensionality reduction and polynomial chaos acceleration of Bayesian inference in inverse problems
- Entropy and sampling numbers of classes of ridge functions
- Erratum: Active Subspace Methods in Theory and Practice: Applications to Kriging Surfaces
- Exploring Regression Structure Using Nonparametric Functional Estimation
- Finite element error analysis of elliptic PDEs with random coefficients and its application to multilevel Monte Carlo methods
- Finite elements for elliptic problems with stochastic coefficients
- Global sensitivity analysis: The primer
- Global sensitivity indices for nonlinear mathematical models and their Monte Carlo estimates
- Karhunen-Loève approximation of random fields by generalized fast multipole methods
- Learning functions of few arbitrary linear parameters in high dimensions
- On Nonlinear Functions of Linear Combinations
- On Principal Hessian Directions for Data Visualization and Dimension Reduction: Another Application of Stein's Lemma
- On dimension reduction in regressions with multivariate responses
- Poincaré inequalities on intervals -- application to sensitivity analysis
- Principal component analysis.
- Projection pursuit
- Ridge functions
- Second order Poincaré inequalities and CLTs on Wiener space
- Sensitivity Analysis in Practice
- Sensitivity analysis for multidimensional and functional outputs
- Sensitivity indices for multivariate outputs
- Sliced Inverse Regression for Dimension Reduction
- Some extensions of multivariate sliced inverse regression
- Sufficient dimension reduction and prediction in regression
- Theory and practice of finite elements.
- Time‐dependent global sensitivity analysis with active subspaces for a lithium ion battery model
- Uncertainty management in simulation-optimization of complex systems. Algorithms and applications
Cited in (33)
- Modified Active Subspaces Using the Average of Gradients
- Dimension reduction via Gaussian ridge functions
- Multi‐fidelity data fusion through parameter space reduction with applications to automotive engineering
- Reduced basis methods for time-dependent problems
- Certified dimension reduction in nonlinear Bayesian inverse problems
- Principal feature detection via \(\phi \)-Sobolev inequalities
- Optimal representation of multivariate functions or data in visualizable low-dimensional spaces
- Embedded ridge approximations
- Derivative-Based Global Sensitivity Analysis for Models with High-Dimensional Inputs and Functional Outputs
- Model Reduction for Nonlinear Systems by Balanced Truncation of State and Gradient Covariance
- Data-free likelihood-informed dimension reduction of Bayesian inverse problems
- A Supervised Learning Approach Involving Active Subspaces for an Efficient Genetic Algorithm in High-Dimensional Optimization Problems
- Preintegration via Active Subspace
- Nonlinear dimension reduction for surrogate modeling using gradient information
- Active subspace methods in theory and practice: applications to kriging surfaces
- Gaussian quadrature and polynomial approximation for one-dimensional ridge functions
- Symmetry results for decay solutions of elliptic systems in the whole space
- Derivative-informed neural operator: an efficient framework for high-dimensional parametric derivative learning
- Gradient-free construction of active subspaces for dimension reduction in complex models with applications to neutronics
- Prior normalization for certified likelihood-informed subspace detection of Bayesian inverse problems
- Derivative-informed projected neural networks for high-dimensional parametric maps governed by PDEs
- Kernel‐based active subspaces with application to computational fluid dynamics parametric problems using the discontinuous Galerkin method
- Generalized bounds for active subspaces
- Learning high-dimensional parametric maps via reduced basis adaptive residual networks
- Characterization of flow through random media via Karhunen-Loève expansion: an information theory perspective
- A distributed active subspace method for scalable surrogate modeling of function valued outputs
- On the Deep Active-Subspace Method
- Large-scale Bayesian optimal experimental design with derivative-informed projected neural network
- A local approach to parameter space reduction for regression and classification tasks
- Gradient free active subspace construction using Morris screening elementary effects
- Multifidelity Dimension Reduction via Active Subspaces
- An efficient dimension reduction for the Gaussian process emulation of two nested codes with functional outputs
- Structure exploiting methods for fast uncertainty quantification in multiphase flow through heterogeneous media