Dimension reduction via Gaussian ridge functions
From MaRDI portal
Publication:4960976
Abstract: Ridge functions have recently emerged as a powerful set of ideas for subspace-based dimension reduction. In this paper we begin by drawing parallels between ridge subspaces, sufficient dimension reduction and active subspaces, contrasting techniques rooted in statistical regression with those rooted in approximation theory. This sets the stage for our new algorithm, which approximates what we call a Gaussian ridge function, the posterior mean of a Gaussian process on a dimension-reducing subspace, and is suitable for both regression and approximation problems. We develop an iterative algorithm that optimizes over the Stiefel manifold to estimate this subspace, followed by an optimization of the hyperparameters of the Gaussian process. We demonstrate the utility of the algorithm on two analytical functions, where we obtain near-exact ridge recovery, and on a turbomachinery case study, where we compare the efficacy of our approach with three well-known sufficient dimension reduction methods: sliced inverse regression (SIR), sliced average variance estimation (SAVE) and contour regression (CR). The comparisons motivate the use of the posterior variance as a heuristic for identifying the suitability of a dimension-reducing subspace.
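The abstract's core construction, a Gaussian ridge function as the posterior mean of a GP fitted on reduced coordinates U^T x, with U an orthonormal matrix on the Stiefel manifold, can be illustrated with a minimal Python sketch. This is not the paper's algorithm: a QR-retracted random search stands in for its Stiefel-manifold optimizer (the paper uses manifold optimization, e.g. via Manopt), kernel hyperparameters are held fixed rather than optimized, and all function names here are illustrative.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel on the (reduced) coordinates.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def gp_posterior_mean(U, X, y, Xstar, noise=1e-6):
    # Posterior mean of a GP fitted on the projected inputs U^T x:
    # this is the "Gaussian ridge function" evaluated at Xstar.
    Z, Zs = X @ U, Xstar @ U
    K = rbf_kernel(Z, Z) + noise * np.eye(len(Z))
    return rbf_kernel(Zs, Z) @ np.linalg.solve(K, y)

def fit_ridge_subspace(X, y, d, iters=100, step=0.2, seed=0):
    # Crude random search over the Stiefel manifold St(m, d):
    # perturb U, retract back onto the manifold via QR, and keep
    # the proposal if the held-out error of the ridge GP improves.
    # A placeholder for the paper's manifold-gradient optimization.
    rng = np.random.default_rng(seed)
    n, m = X.shape
    tr = np.arange(n) % 2 == 0          # even/odd train/validation split
    def loss(U):
        pred = gp_posterior_mean(U, X[tr], y[tr], X[~tr], noise=1e-4)
        return np.mean((pred - y[~tr]) ** 2)
    U, _ = np.linalg.qr(rng.standard_normal((m, d)))
    best = loss(U)
    for _ in range(iters):
        V, _ = np.linalg.qr(U + step * rng.standard_normal((m, d)))
        l = loss(V)
        if l < best:
            U, best = V, l
    return U, best
```

The QR factorization serves as a retraction: any perturbed candidate is mapped back to a matrix with orthonormal columns, so every iterate stays on the Stiefel manifold.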
Recommendations
- A near-stationary subspace for ridge approximation
- Gradient-based dimension reduction of multivariate vector-valued functions
- Active subspace methods in theory and practice: applications to kriging surfaces
- Approximation of generalized ridge functions in high dimensions
- Inverse regression for ridge recovery: a data-driven approach for parameter reduction in computer experiments
Cites work
- scientific article; zbMATH DE number 1220060
- scientific article; zbMATH DE number 6159604
- scientific article; zbMATH DE number 5586093
- scientific article; zbMATH DE number 5223994
- A modified SEIR model for the spread of Ebola in western Africa and metrics for resource allocation
- A near-stationary subspace for ridge approximation
- A review on dimension reduction
- Active subspace methods in theory and practice: applications to kriging surfaces
- Active subspaces. Emerging ideas for dimension reduction in parameter studies
- Capturing ridge functions in high dimensions from point queries
- Dimension Reduction for Gaussian Process Emulation: An Application to the Influence of Bathymetry on Tsunami Heights
- Discovering an active subspace in a single‐diode solar cell model
- Exploring Regression Structure Using Nonparametric Functional Estimation
- Gaussian processes for machine learning (GPML) toolbox
- Gaussian processes for machine learning.
- Gaussian processes with built-in dimensionality reduction: applications to high-dimensional uncertainty propagation
- Inverse regression for ridge recovery: a data-driven approach for parameter reduction in computer experiments
- Learning functions of few arbitrary linear parameters in high dimensions
- Learning non-parametric basis independent models from point queries via low-rank methods
- Low-rank matrix completion via preconditioned optimization on the Grassmann manifold
- Manopt, a Matlab toolbox for optimization on manifolds
- On Directional Regression for Dimension Reduction
- Optimal Design of Experiments
- Optimization. Algorithms and consistent approximations
- Rejoinder: Fisher lecture: Dimension reduction in regression
- Ridge functions
- Save: a method for dimension reduction and graphics in regression
- Sliced Inverse Regression for Dimension Reduction
- Stability analysis of thermo-acoustic nonlinear eigenproblems in annular combustors. Part II. Uncertainty quantification
- Sufficient Dimension Reduction via Inverse Regression
- Sufficient dimension reduction and prediction in regression
- The Elements of Statistical Learning
- The Geometry of Algorithms with Orthogonality Constraints
Cited in (11)
- A Hierarchical Expected Improvement Method for Bayesian Optimization
- Efficient reduced-rank methods for Gaussian processes with eigenfunction expansions
- Data-driven polynomial ridge approximation using variable projection
- Gradient-based dimension reduction of multivariate vector-valued functions
- Embedded ridge approximations
- Inverse regression for ridge recovery: a data-driven approach for parameter reduction in computer experiments
- Gaussian quadrature and polynomial approximation for one-dimensional ridge functions
- Hilbert space methods for reduced-rank Gaussian process regression
- Hierarchical shrinkage Gaussian processes: applications to computer code emulation and dynamical system recovery
- Capturing ridge functions in high dimensions from point queries
- A near-stationary subspace for ridge approximation