Data-driven surrogates for high dimensional models using Gaussian process regression on the Grassmann manifold
From MaRDI portal
DOI: 10.1016/j.cma.2020.113269 · zbMath: 1506.62549 · arXiv: 2003.11910 · OpenAlex: W3013716343 · MaRDI QID: Q2020284
Publication date: 23 April 2021
Published in: Computer Methods in Applied Mechanics and Engineering
Full work available at URL: https://arxiv.org/abs/2003.11910
Keywords: interpolation · Grassmann manifold · machine learning · spectral clustering · Gaussian process regression · nonlinear projection
MSC classification:
- Statistics on manifolds (62R30)
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Probabilistic models, generic numerical methods in probability and statistics (65C20)
- Stochastic ordinary differential equations (aspects of stochastic analysis) (60H10)
Related Items
- Probabilistic learning inference of boundary value problem with uncertainties based on Kullback-Leibler divergence under implicit constraints
- Grassmannian diffusion maps-based dimension reduction and classification for high-dimensional data
- Data-driven uncertainty quantification in computational human head models
- Active learning with multifidelity modeling for efficient rare event simulation
- A survey of unsupervised learning methods for high-dimensional uncertainty quantification in black-box-type problems
- A data-driven surrogate modeling approach for time-dependent incompressible Navier-Stokes equations with dynamic mode decomposition and manifold interpolation
- Rates of the strong uniform consistency for the kernel-type regression function estimators with general kernels on manifolds
- Grassmannian diffusion maps based surrogate modeling via geometric harmonics
- Projection pursuit adaptation on polynomial chaos expansions
- On the influence of over-parameterization in manifold based surrogates and deep neural operators
- Polynomial-chaos-based conditional statistics for probabilistic learning with heterogeneous data applied to atomic collisions of helium on graphite substrate
- AI in computational mechanics and engineering sciences
- Probabilistic learning on manifolds constrained by nonlinear partial differential equations for small datasets
- Probabilistic model updating via variational Bayesian inference and adaptive Gaussian process modeling
- Latent map Gaussian processes for mixed variable metamodeling
Uses Software
Cites Work
- Adaptive sparse polynomial chaos expansion based on least angle regression
- Data-driven probability concentration and sampling on manifold
- Reduced basis approximation and a posteriori error estimation for affinely parametrized elliptic coercive partial differential equations. Application to transport and continuum mechanics.
- The multi-element probabilistic collocation method (ME-PCM): Error analysis and applications
- An adaptive hierarchical sparse grid collocation algorithm for the solution of stochastic differential equations
- The design and analysis of computer experiments.
- Riemannian geometry of Grassmann manifolds with a view on algorithmic computation
- Uncertainty quantification for complex systems with very high dimensional response using Grassmann manifold variations
- An adaptive local reduced basis method for solving PDEs with uncertain inputs and evaluating risk
- Modeling uncertainties in molecular dynamics simulations using a stochastic reduced-order basis
- Bayesian identification of a projection-based reduced order model for computational fluid dynamics
- PLS-based adaptation for efficient PCE representation in high dimensions
- Accelerated scale bridging with sparsely approximated Gaussian learning
- Physics-informed neural networks: a deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations
- An adaptive multi-element generalized polynomial chaos method for stochastic differential equations
- Diffusion maps
- Sparse On-Line Gaussian Processes
- A Survey of Projection-Based Model Reduction Methods for Parametric Dynamical Systems
- Schubert Varieties and Distances between Subspaces of Different Dimensions
- An Online Method for Interpolating Linear Parametric Reduced-Order Models
- Multi-Element Generalized Polynomial Chaos for Arbitrary Probability Measures
- Model Reduction for Large-Scale Systems with High-Dimensional Parametric Input Space
- Riemannian center of mass and mollifier smoothing
- The Geometry of Algorithms with Orthogonality Constraints
- Principal Manifolds and Nonlinear Dimensionality Reduction via Tangent Space Alignment
- The Wiener-Askey polynomial chaos for stochastic differential equations
- Laplacian Eigenmaps for Dimensionality Reduction and Data Representation
- Polynomial-chaos-based Kriging
- Assessing the performance of Leja and Clenshaw-Curtis collocation for computational electromagnetics with random input data
- Extending classical surrogate modeling to high dimensions through supervised dimensionality reduction: a data-driven approach
- Hessian eigenmaps: Locally linear embedding techniques for high-dimensional data
- High-Order Collocation Methods for Differential Equations with Random Inputs
- Dynamical Properties of Truncated Wiener-Hermite Expansions
- A Stochastic Collocation Method for Elliptic Partial Differential Equations with Random Input Data
- Parallel three-dimensional simulations of quasi-static elastoplastic solids