Level Set Learning with Pseudoreversible Neural Networks for Nonlinear Dimension Reduction in Function Approximation
DOI: 10.1137/21M1459198 · zbMATH Open: 1515.65047 · arXiv: 2112.01438 · OpenAlex: W4379384852 · MaRDI QID: Q6155903
Authors: Yuankai Teng, Zhu Wang, Lili Ju, Anthony Gruber, Guannan Zhang
Publication date: 7 June 2023
Published in: SIAM Journal on Scientific Computing
Full work available at URL: https://arxiv.org/abs/2112.01438
Recommendations
- Nonlinear level set learning for function approximation on sparse data with applications to parametric differential equations
- Nonlinear dimension reduction for surrogate modeling using gradient information
- Learning high-dimensional parametric maps via reduced basis adaptive residual networks
- Model reduction and neural networks for parametric PDEs
- Non-intrusive model reduction of large-scale, nonlinear dynamical systems using deep learning
Keywords: dimension reduction; sparse data; function approximation; level set learning; pseudoreversible neural network; synthesized regression
MSC classifications:
- Computer science aspects of computer-aided design (68U07)
- Algorithms for approximation of functions (65D15)
- Numerical approximation of high-dimensional functions; sparse grids (65D40)
Cites Work
- Sliced Inverse Regression for Dimension Reduction
- Comment
- Updating Quasi-Newton Matrices with Limited Storage
- Deep learning
- Sufficient dimension reduction and prediction in regression
- Sufficient Dimension Reduction via Inverse Regression
- On Principal Hessian Directions for Data Visualization and Dimension Reduction: Another Application of Stein's Lemma
- Orthogonal Array-Based Latin Hypercubes
- Save: a method for dimension reduction and graphics in regression
- The Mathematical Theory of Finite Element Methods
- Hessian eigenmaps: Locally linear embedding techniques for high-dimensional data
- Ridge functions
- A general theory for nonlinear sufficient dimension reduction: formulation and estimation
- Active subspaces. Emerging ideas for dimension reduction in parameter studies
- Active subspace methods in theory and practice: applications to kriging surfaces
- Nonlinear sufficient dimension reduction for functional data
- Simulator-free solution of high-dimensional stochastic elliptic partial differential equations using deep neural networks
- Deep UQ: learning deep neural network surrogate models for high dimensional uncertainty quantification
- Sufficient dimension reduction: methods and applications with R
- Constructing least-squares polynomial approximations
- Nonlinear level set learning for function approximation on sparse data with applications to parametric differential equations
Cited In (3)
- Multilevel Fine-Tuning: Closing Generalization Gaps in Approximation of Solution Maps under a Limited Budget for Training Data
- Nonlinear level set learning for function approximation on sparse data with applications to parametric differential equations
- Improving the expressive power of deep neural networks through integral activation transform