Nonlinear level set learning for function approximation on sparse data with applications to parametric differential equations

From MaRDI portal
Publication: 5864754

DOI: 10.4208/NMTMA.OA-2021-0062
zbMATH Open: 1499.65052
arXiv: 2104.14072
OpenAlex: W3200632266
Wikidata: Q115209276 (Scholia: Q115209276)
MaRDI QID: Q5864754
FDO: Q5864754


Authors: Anthony Gruber, Max Gunzburger, Lili Ju, Yuankai Teng, Zhu Wang


Publication date: 8 June 2022

Published in: Numerical Mathematics: Theory, Methods and Applications

Abstract: A dimension reduction method based on the "Nonlinear Level set Learning" (NLL) approach is presented for the pointwise prediction of functions which have been sparsely sampled. Leveraging geometric information provided by the Implicit Function Theorem, the proposed algorithm effectively reduces the input dimension to the theoretical lower bound with minor accuracy loss, providing a one-dimensional representation of the function which can be used for regression and sensitivity analysis. Experiments and applications are presented which compare this modified NLL with the original NLL and the Active Subspaces (AS) method. While accommodating sparse input data, the proposed algorithm is shown to train quickly and provide a much more accurate and informative reduction than either AS or the original NLL on two example functions with high-dimensional domains, as well as two state-dependent quantities depending on the solutions to parametric differential equations.
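The abstract compares the modified NLL against the Active Subspaces (AS) method. As a rough illustration of that AS baseline only (not the authors' NLL algorithm or their implementation), the sketch below estimates the gradient covariance of a function from samples and projects the inputs onto its dominant eigenvector to obtain a one-dimensional reduced coordinate. The test function `f`, the ridge direction `W`, and the sample count are invented for this example.

```python
import numpy as np

# Hypothetical ridge direction: f depends on a 5-D input x only through W·x,
# so the theoretical lower bound on the reduced dimension is one.
W = np.array([3.0, 1.0, 0.5, 0.2, 0.1])

def f(x):
    """Example ridge function of a 5-D input (rows of x are samples)."""
    return np.sin(x @ W)

def grad_f(x):
    """Analytic gradient of f at each sample; shape (n_samples, 5)."""
    return np.cos(x @ W)[:, None] * W[None, :]

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 5))   # modest sample of the domain
G = grad_f(X)                                # gradients at the samples

# Monte Carlo estimate of the gradient covariance C = E[∇f ∇fᵀ].
C = G.T @ G / len(X)

# Active subspace = span of the dominant eigenvectors of C.
eigvals, eigvecs = np.linalg.eigh(C)         # ascending eigenvalues
u = eigvecs[:, -1]                           # dominant direction

# One-dimensional reduced coordinate, usable for regression on f.
z = X @ u
```

Because `f` is exactly a ridge function, every sampled gradient is a multiple of `W`, so `C` is (numerically) rank one and the recovered direction `u` aligns with `W` up to sign; for generic functions AS instead yields the best linear one-dimensional reduction, which is the limitation the nonlinear NLL approach addresses.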


Full work available at URL: https://arxiv.org/abs/2104.14072




Cited In (2)
