Gradient-based dimension reduction of multivariate vector-valued functions


DOI: 10.1137/18M1221837
zbMATH Open: 1433.41007
arXiv: 1801.07922
MaRDI QID: Q5220403


Authors: Olivier Zahm, Paul G. Constantine, Clémentine Prieur, Youssef M. Marzouk


Publication date: 20 March 2020

Published in: SIAM Journal on Scientific Computing

Abstract: Multivariate functions encountered in high-dimensional uncertainty quantification problems often vary most strongly along a few dominant directions in the input parameter space. We propose a gradient-based method for detecting these directions and using them to construct ridge approximations of such functions, in the case where the functions are vector-valued (e.g., taking values in $\mathbb{R}^n$). The methodology consists of minimizing an upper bound on the approximation error, obtained by subspace Poincaré inequalities. We provide a thorough mathematical analysis in the case where the parameter space is equipped with a Gaussian probability measure. The resulting method generalizes the notion of active subspaces associated with scalar-valued functions. A numerical illustration shows that using gradients of the function yields effective dimension reduction. We also show how the choice of norm on the codomain of the function affects its low-dimensional approximation.
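
To make the construction concrete, here is a minimal sketch (not the authors' reference implementation) of the gradient-based subspace estimate the abstract describes: average Jacobian outer products over a Gaussian sample, optionally weighted by a matrix defining the norm on the codomain, then take the leading eigenvectors of the result. All names below (estimate_active_subspace, the test function) are hypothetical, and the sketch assumes a standard Gaussian input measure.

import numpy as np

def estimate_active_subspace(jacobian, dim_in, n_samples=1000, rank=2,
                             codomain_norm=None, seed=None):
    # Monte Carlo estimate of H = E[J(X)^T R_V J(X)] for a function
    # f: R^m -> R^n with Jacobian J(x) of shape n-by-m, where R_V is a
    # symmetric positive definite matrix defining the norm on the
    # codomain (identity if codomain_norm is None). The dominant
    # eigenvectors of H span the estimated low-dimensional subspace,
    # generalizing active subspaces to vector-valued functions.
    rng = np.random.default_rng(seed)
    H = np.zeros((dim_in, dim_in))
    for _ in range(n_samples):
        x = rng.standard_normal(dim_in)  # draw from the Gaussian measure
        J = jacobian(x)
        H += J.T @ J if codomain_norm is None else J.T @ codomain_norm @ J
    H /= n_samples
    eigvals, eigvecs = np.linalg.eigh(H)  # eigh returns ascending eigenvalues
    order = np.argsort(eigvals)[::-1]     # reorder: largest first
    return eigvecs[:, order[:rank]], eigvals[order]

# Hypothetical test: f(x) = (sin(a.x), cos(a.x)) varies only along a,
# so H = a a^T and the one-dimensional subspace should align with a.
a = np.array([1.0, 2.0, 0.0, 0.0])
jac = lambda x: np.vstack([np.cos(a @ x) * a, -np.sin(a @ x) * a])
U, spectrum = estimate_active_subspace(jac, dim_in=4, rank=1)
print(spectrum)  # one dominant eigenvalue, the rest near zero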


Full work available at URL: https://arxiv.org/abs/1801.07922






Cited in: 28 documents




