Data-driven polynomial ridge approximation using variable projection
DOI: 10.1137/17M1117690
zbMATH Open: 1392.49034
arXiv: 1702.05859
MaRDI QID: Q5745137
Authors: Jeffrey M. Hokanson, Paul G. Constantine
Publication date: 5 June 2018
Published in: SIAM Journal on Scientific Computing
Abstract: Inexpensive surrogates are useful for reducing the cost of science and engineering studies involving large-scale, complex computational models with many input parameters. A ridge approximation is one class of surrogate that models a quantity of interest as a nonlinear function of a few linear combinations of the input parameters. When used in parameter studies (e.g., optimization or uncertainty quantification), ridge approximations allow the low-dimensional structure to be exploited, reducing the effective dimension. We introduce a new, fast algorithm for constructing a ridge approximation where the nonlinear function is a polynomial. This polynomial ridge approximation is chosen to minimize the least-squares mismatch between the surrogate and the quantity of interest on a given set of inputs. Naively, this would require optimizing both the polynomial coefficients and the weights of the linear combinations; the latter define a low-dimensional subspace of the input space. However, given a fixed subspace, the optimal polynomial can be found by solving a linear least-squares problem, and hence, by using variable projection, the polynomial can be found implicitly, leaving an optimization problem over the subspace alone. We provide an algorithm that finds this polynomial ridge approximation by minimizing over the Grassmann manifold of low-dimensional subspaces using a Gauss-Newton method. We provide details of this optimization algorithm and demonstrate its performance on several numerical examples. Our Gauss-Newton method has superior theoretical guarantees and faster convergence than the alternating approach for polynomial ridge approximation proposed earlier by Constantine, Eftekhari, Hokanson, and Ward [1], which alternates between (i) optimizing the polynomial coefficients given the subspace and (ii) optimizing the subspace given the coefficients.
Full work available at URL: https://arxiv.org/abs/1702.05859
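The abstract's central computational trick, that for a fixed subspace the optimal polynomial coefficients solve a linear least-squares problem and can therefore be projected out, is easy to sketch in code. The following is a minimal, hypothetical Python illustration for a one-dimensional ridge approximation f(x) ≈ g(uᵀx); it substitutes a generic quasi-Newton optimizer from SciPy for the paper's Gauss-Newton iteration on the Grassmann manifold, and all function names and parameter choices are illustrative rather than the authors' implementation.

```python
import numpy as np
from scipy.optimize import minimize

def varpro_objective(u, X, y, degree):
    """Squared residual after eliminating the polynomial coefficients.

    For a fixed direction u, the best-fitting polynomial is a linear
    least-squares solution, so only u remains as a free variable.
    """
    u = u / np.linalg.norm(u)                  # stay on the unit sphere, a stand-in for Gr(1, m)
    z = X @ u                                  # reduced one-dimensional coordinates
    V = np.vander(z, degree + 1)               # polynomial basis evaluated on the subspace
    c, *_ = np.linalg.lstsq(V, y, rcond=None)  # inner linear solve: the "projected-out" variables
    return np.sum((V @ c - y) ** 2)

def fit_ridge(X, y, degree=3, n_starts=10, seed=0):
    """Fit the ridge direction by minimizing the projected residual."""
    rng = np.random.default_rng(seed)
    best = None
    for _ in range(n_starts):                  # random restarts guard against local minima
        u0 = rng.standard_normal(X.shape[1])
        res = minimize(varpro_objective, u0, args=(X, y, degree))
        if best is None or res.fun < best.fun:
            best = res
    u = best.x / np.linalg.norm(best.x)
    c, *_ = np.linalg.lstsq(np.vander(X @ u, degree + 1), y, rcond=None)
    return u, c

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.standard_normal((200, 5))
    u_true = np.array([3.0, 4.0, 0.0, 0.0, 0.0]) / 5.0
    y = (X @ u_true) ** 2 + X @ u_true         # an exact quadratic ridge function
    u, c = fit_ridge(X, y, degree=2)
    print("recovered direction:", np.round(u, 3))  # should match u_true up to sign
```

The paper instead searches over the Grassmann manifold of n-dimensional subspaces with a Gauss-Newton method that exploits the least-squares structure for faster convergence; normalizing u inside the objective above is only a crude one-dimensional stand-in for that geometry.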
Recommendations
- A near-stationary subspace for ridge approximation
- Gaussian quadrature and polynomial approximation for one-dimensional ridge functions
- Dimension reduction via Gaussian ridge functions
- Linear/ridge expansions: enhancing linear approximations by ridge functions
- Inverse regression for ridge recovery: a data-driven approach for parameter reduction in computer experiments
Mathematics Subject Classification
- General nonlinear regression (62J02)
- Methods of quasi-Newton type (90C53)
- Newton-type methods (49M15)
- Large-scale systems (93A15)
Cites Work
- Applied Linear Regression
- Pymanopt: a Python toolbox for optimization on manifolds using automatic differentiation
- Scikit-learn: machine learning in Python
- A new look at the statistical model identification
- The design and analysis of computer experiments
- Gaussian processes for machine learning
- A taxonomy of global optimization methods based on response surfaces
- The Geometry of Algorithms with Orthogonality Constraints
- Dimension Reduction for Gaussian Process Emulation: An Application to the Influence of Bathymetry on Tsunami Heights
- An Adaptive Estimation of Dimension Reduction Space
- Gaussian processes with built-in dimensionality reduction: applications to high-dimensional uncertainty propagation
- Gradient-based kernel dimension reduction for regression
- The Differentiation of Pseudo-Inverses and Nonlinear Least Squares Problems Whose Variables Separate
- Sufficient dimension reduction and prediction in regression
- Projection pursuit
- A Multiple-Index Model and Dimension Reduction
- Learning functions of few arbitrary linear parameters in high dimensions
- Accuracy and Stability of Numerical Algorithms
- Learning non-parametric basis independent models from point queries via low-rank methods
- Differential geometry of Grassmann manifolds
- Ridge functions
- Estimating Computational Noise
- Cross-Validation of Regression Models
- Response surface methodology. Process and product optimization using designed experiments
- The Elements of Statistical Learning
- Compressive sampling of polynomial chaos expansions: convergence analysis and sampling strategies
- On Nonlinear Functions of Linear Combinations
- Survey of modeling and optimization strategies to solve high-dimensional design problems with computationally-expensive black-box functions
- Algorithms for Separable Nonlinear Least Squares Problems
- Capturing ridge functions in high dimensions from point queries
- Active subspaces. Emerging ideas for dimension reduction in parameter studies
- Discovering an active subspace in a single‐diode solar cell model
- Mathematical analysis and dynamic active subspaces for a long term model of HIV
- Active subspace methods in theory and practice: applications to kriging surfaces
- Exploiting active subspaces to quantify uncertainty in the numerical simulation of the hyshot II scramjet
- Introduction to uncertainty quantification
- How bad are Vandermonde matrices?
- A near-stationary subspace for ridge approximation
- Time‐dependent global sensitivity analysis with active subspaces for a lithium ion battery model
- Dimension reduction in magnetohydrodynamics power generation models: Dimensional analysis and active subspaces
Cited In (13)
- Sequential Learning of Active Subspaces
- Compressive sensing adaptation for polynomial chaos expansions
- Optimization by moving ridge functions: derivative-free optimization for computationally intensive functions
- Improved multifidelity Monte Carlo estimators based on normalizing flows and dimensionality reduction techniques
- Artificial neural network based response surface for data-driven dimensional analysis
- Embedded ridge approximations
- A Lipschitz Matrix for Parameter Reduction in Computational Science
- Inverse regression for ridge recovery: a data-driven approach for parameter reduction in computer experiments
- Linear/ridge expansions: enhancing linear approximations by ridge functions
- Gaussian quadrature and polynomial approximation for one-dimensional ridge functions
- A machine learning approach to portfolio pricing and risk management for high‐dimensional problems
- Generalized bounds for active subspaces
- A near-stationary subspace for ridge approximation