Data-driven surrogates for high dimensional models using Gaussian process regression on the Grassmann manifold
Publication: 2020284
DOI: 10.1016/j.cma.2020.113269
zbMATH: 1506.62549
arXiv: 2003.11910
OpenAlex: W3013716343
MaRDI QID: Q2020284
Publication date: 23 April 2021
Published in: Computer Methods in Applied Mechanics and Engineering
Full work available at URL: https://arxiv.org/abs/2003.11910
Keywords: interpolation; Grassmann manifold; machine learning; spectral clustering; Gaussian process regression; nonlinear projection
MSC classification:
- Statistics on manifolds (62R30)
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Probabilistic models, generic numerical methods in probability and statistics (65C20)
- Stochastic ordinary differential equations (aspects of stochastic analysis) (60H10)
Related Items (15)
Uses Software
Cites Work
- Adaptive sparse polynomial chaos expansion based on least angle regression
- Data-driven probability concentration and sampling on manifold
- Reduced basis approximation and a posteriori error estimation for affinely parametrized elliptic coercive partial differential equations. Application to transport and continuum mechanics.
- The multi-element probabilistic collocation method (ME-PCM): Error analysis and applications
- An adaptive hierarchical sparse grid collocation algorithm for the solution of stochastic differential equations
- The design and analysis of computer experiments.
- Riemannian geometry of Grassmann manifolds with a view on algorithmic computation
- Uncertainty quantification for complex systems with very high dimensional response using Grassmann manifold variations
- An adaptive local reduced basis method for solving PDEs with uncertain inputs and evaluating risk
- Modeling uncertainties in molecular dynamics simulations using a stochastic reduced-order basis
- Bayesian identification of a projection-based reduced order model for computational fluid dynamics
- PLS-based adaptation for efficient PCE representation in high dimensions
- Accelerated scale bridging with sparsely approximated Gaussian learning
- Physics-informed neural networks: a deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations
- An adaptive multi-element generalized polynomial chaos method for stochastic differential equations
- Diffusion maps
- Sparse On-Line Gaussian Processes
- A Survey of Projection-Based Model Reduction Methods for Parametric Dynamical Systems
- Schubert Varieties and Distances between Subspaces of Different Dimensions
- An Online Method for Interpolating Linear Parametric Reduced-Order Models
- Multi-Element Generalized Polynomial Chaos for Arbitrary Probability Measures
- Model Reduction for Large-Scale Systems with High-Dimensional Parametric Input Space
- Riemannian center of mass and mollifier smoothing
- The Geometry of Algorithms with Orthogonality Constraints
- Principal Manifolds and Nonlinear Dimensionality Reduction via Tangent Space Alignment
- The Wiener--Askey Polynomial Chaos for Stochastic Differential Equations
- Laplacian Eigenmaps for Dimensionality Reduction and Data Representation
- Polynomial-chaos-based Kriging
- Assessing the performance of Leja and Clenshaw-Curtis collocation for computational electromagnetics with random input data
- Extending classical surrogate modeling to high dimensions through supervised dimensionality reduction: a data-driven approach
- Hessian eigenmaps: Locally linear embedding techniques for high-dimensional data
- High-Order Collocation Methods for Differential Equations with Random Inputs
- Dynamical Properties of Truncated Wiener-Hermite Expansions
- A Stochastic Collocation Method for Elliptic Partial Differential Equations with Random Input Data
- Parallel three-dimensional simulations of quasi-static elastoplastic solids