Nonlinear level set learning for function approximation on sparse data with applications to parametric differential equations
From MaRDI portal
Publication:5864754
Abstract: A dimension reduction method based on the "Nonlinear Level set Learning" (NLL) approach is presented for the pointwise prediction of functions that have been sparsely sampled. Leveraging geometric information provided by the Implicit Function Theorem, the proposed algorithm effectively reduces the input dimension to the theoretical lower bound with minor accuracy loss, providing a one-dimensional representation of the function which can be used for regression and sensitivity analysis. Experiments and applications are presented which compare this modified NLL with the original NLL and the Active Subspaces (AS) method. While accommodating sparse input data, the proposed algorithm is shown to train quickly and to provide a much more accurate and informative reduction than either AS or the original NLL on two example functions with high-dimensional domains, as well as on two state-dependent quantities depending on the solutions to parametric differential equations.
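To make the comparison baseline concrete, the following is a minimal sketch of the Active Subspaces (AS) idea mentioned in the abstract: estimate the dominant input direction of a function from sampled gradients via an eigendecomposition of the empirical gradient covariance, then use the one-dimensional projected coordinate for regression. The example function, sample sizes, and direction vector are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Illustrative ridge function f(x) = sin(a . x), which varies only
# along the (assumed, hypothetical) direction a in R^4.
a = np.array([1.0, 2.0, -1.0, 0.5])

def f(x):
    return np.sin(x @ a)

def grad_f(x):
    # Analytic gradient samples, shape (n, d).
    return np.cos(x @ a)[:, None] * a

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 4))   # modest sample of inputs
G = grad_f(X)                            # gradients at the samples
C = G.T @ G / len(X)                     # empirical gradient covariance
eigvals, eigvecs = np.linalg.eigh(C)
w = eigvecs[:, -1]                       # dominant "active" direction

# For an exact ridge function the recovered unit direction aligns
# with a up to sign, so |w . a| / ||a|| is close to 1.
alignment = abs(w @ a) / np.linalg.norm(a)
print(round(alignment, 3))
```

The NLL approach of the paper replaces this single linear projection with a learned nonlinear transformation, which is why it can reach a one-dimensional representation on functions where no linear subspace suffices.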
Recommendations
- Level Set Learning with Pseudoreversible Neural Networks for Nonlinear Dimension Reduction in Function Approximation
- Nonlinear dimension reduction for surrogate modeling using gradient information
- Nonintrusive Reduced-Order Models for Parametric Partial Differential Equations via Data-Driven Operator Inference
- Symmetry results for decay solutions of elliptic systems in the whole space
- A deep learning approach to Reduced Order Modelling of parameter dependent partial differential equations
Cites work
- Scientific article (zbMATH DE number 1404611; title unavailable)
- A modified SEIR model for the spread of Ebola in western Africa and metrics for resource allocation
- A near-stationary subspace for ridge approximation
- A review on dimension reduction
- Active subspace methods in theory and practice: applications to kriging surfaces
- Active subspaces. Emerging ideas for dimension reduction in parameter studies
- Approximation by Ridge functions and neural networks with one hidden layer
- Boundary regularity and the Dirichlet problem for harmonic maps
- Fisher lecture: Dimension reduction in regression
- Multifidelity Dimension Reduction via Active Subspaces
- Multilayer feedforward networks are universal approximators
- On finite population sampling theory under certain linear regression models
- Parallel transport unfolding: a connection-based manifold learning approach
- Reducing the Dimensionality of Data with Neural Networks
- Response surface methodology. Process and product optimization using designed experiments
- Robust kernel Isomap
- Sampling Algorithms and Coresets for $\ell_p$ Regression
- Sparse sufficient dimension reduction
- Sufficient dimension reduction and prediction in regression
Cited in (2)