Data-driven surrogates for high dimensional models using Gaussian process regression on the Grassmann manifold
Publication: 2020284
Abstract: This paper introduces a surrogate modeling scheme based on Grassmannian manifold learning for cost-efficient predictions of high-dimensional stochastic systems. The method exploits the subspace structure of each solution by projecting it onto a Grassmann manifold. A solution clustering approach identifies regions of the parameter space over which solutions are sufficiently similar that they can be interpolated on the Grassmannian. In this clustering, the reduced-order solutions are partitioned into disjoint clusters on the Grassmann manifold using the eigenstructure of properly defined Grassmannian kernels, and the Karcher mean of each cluster is estimated. The points in each cluster are then mapped onto the tangent space with origin at the corresponding Karcher mean using the logarithmic mapping (the inverse of the exponential mapping). For each cluster, a Gaussian process regression model is trained that maps the input parameters of the system to the reduced solution points of the corresponding cluster projected onto the tangent space. Using this Gaussian process model, the full-field solution can be efficiently predicted at any new point in the parameter space. In certain cases, a single solution cluster spans disjoint regions of the parameter space; in such cases, a second, density-based spatial clustering is applied to group the corresponding input parameter points in the Euclidean space. The proposed method is applied to two numerical examples: a nonlinear stochastic ordinary differential equation with uncertain initial conditions, and the modeling of plastic deformation in a model amorphous solid using the Shear Transformation Zone theory of plasticity.
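To make the workflow concrete, the following is a minimal Python sketch (NumPy and scikit-learn) of the per-cluster tangent-space regression step: logarithmic and exponential maps on the Grassmannian, an iterative Karcher mean, and a Gaussian process fit from input parameters to flattened tangent vectors. This is an illustration under simplifying assumptions, not the authors' implementation: the function names, the fixed rank r, and the RBF kernel are placeholders, and the paper's Grassmannian-kernel spectral clustering and the secondary density-based clustering of the parameter space are omitted.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF


def grassmann_point(snapshot, r):
    """Represent a full-field snapshot by its rank-r left singular basis."""
    U, _, _ = np.linalg.svd(snapshot, full_matrices=False)
    return U[:, :r]


def log_map(X, Y):
    """Tangent vector at X pointing toward Y (inverse exponential map)."""
    M = (Y - X @ (X.T @ Y)) @ np.linalg.inv(X.T @ Y)
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.arctan(s)) @ Vt


def exp_map(X, delta):
    """Point on the Grassmannian reached from X along tangent vector delta."""
    U, s, Vt = np.linalg.svd(delta, full_matrices=False)
    return X @ Vt.T @ np.diag(np.cos(s)) @ Vt + U @ np.diag(np.sin(s)) @ Vt


def karcher_mean(bases, tol=1e-8, max_iter=100):
    """Iterative intrinsic (Karcher) mean of a list of Grassmann points."""
    mu = bases[0]
    for _ in range(max_iter):
        # Average the tangent vectors at the current mean; step along them.
        avg = sum(log_map(mu, Y) for Y in bases) / len(bases)
        if np.linalg.norm(avg) < tol:
            break
        mu = exp_map(mu, avg)
    return mu


def fit_cluster_surrogate(params, bases):
    """Per-cluster GPR from inputs to flattened tangent-space coordinates."""
    mu = karcher_mean(bases)
    tangents = np.stack([log_map(mu, Y).ravel() for Y in bases])
    gp = GaussianProcessRegressor(kernel=RBF(), normalize_y=True)
    gp.fit(np.atleast_2d(params), tangents)
    return mu, gp


def predict_basis(mu, gp, theta):
    """Predict the reduced basis at new parameters theta via the exp map."""
    delta = gp.predict(np.atleast_2d(theta))[0].reshape(mu.shape)
    # exp_map returns approximately orthonormal columns; re-orthonormalize
    # with a thin QR factorization for numerical safety.
    Q, _ = np.linalg.qr(exp_map(mu, delta))
    return Q
```

Given snapshots and inputs already assigned to one cluster, `mu, gp = fit_cluster_surrogate(params, [grassmann_point(s, r) for s in snapshots])` trains the surrogate, and `predict_basis(mu, gp, theta_new)` returns the predicted reduced basis at a new parameter point.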
Recommendations
- Conditional deep surrogate models for stochastic, high-dimensional, and multi-fidelity systems
- Surrogate modeling of high-dimensional problems via data-driven polynomial chaos expansions and sparse partial least square
- Nonlinear dimension reduction for surrogate modeling using gradient information
- ANOVA Gaussian process modeling for high-dimensional stochastic computational models
- Projection-based model reduction of dynamical systems using space-time subspace and machine learning
Cites work
- Scientific article, zbMATH DE number 1054729 (no title available)
- Scientific article, zbMATH DE number 867649 (no title available)
- A Stochastic Collocation Method for Elliptic Partial Differential Equations with Random Input Data
- A survey of projection-based model reduction methods for parametric dynamical systems
- Assessing the performance of Leja and Clenshaw–Curtis collocation for computational electromagnetics with random input data
- Accelerated scale bridging with sparsely approximated Gaussian learning
- Adaptive sparse polynomial chaos expansion based on least angle regression
- An Online Method for Interpolating Linear Parametric Reduced-Order Models
- An adaptive hierarchical sparse grid collocation algorithm for the solution of stochastic differential equations
- An adaptive local reduced basis method for solving PDEs with uncertain inputs and evaluating risk
- An adaptive multi-element generalized polynomial chaos method for stochastic differential equations
- Bayesian identification of a projection-based reduced order model for computational fluid dynamics
- Data-driven probability concentration and sampling on manifold
- Diffusion maps
- Dynamical Properties of Truncated Wiener-Hermite Expansions
- Extending classical surrogate modeling to high dimensions through supervised dimensionality reduction: a data-driven approach
- Gaussian processes for machine learning
- Hessian eigenmaps: Locally linear embedding techniques for high-dimensional data
- High-Order Collocation Methods for Differential Equations with Random Inputs
- Laplacian Eigenmaps for Dimensionality Reduction and Data Representation
- Model Reduction for Large-Scale Systems with High-Dimensional Parametric Input Space
- Modeling uncertainties in molecular dynamics simulations using a stochastic reduced-order basis
- Multi-Element Generalized Polynomial Chaos for Arbitrary Probability Measures
- PLS-based adaptation for efficient PCE representation in high dimensions
- Polynomial-chaos-based Kriging
- Parallel three-dimensional simulations of quasi-static elastoplastic solids
- Physics-informed neural networks: a deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations
- Principal Manifolds and Nonlinear Dimensionality Reduction via Tangent Space Alignment
- Reduced basis approximation and a posteriori error estimation for affinely parametrized elliptic coercive partial differential equations. Application to transport and continuum mechanics.
- Riemannian center of mass and mollifier smoothing
- Riemannian geometry of Grassmann manifolds with a view on algorithmic computation
- Schubert varieties and distances between subspaces of different dimensions
- Sparse on-line Gaussian processes
- The Geometry of Algorithms with Orthogonality Constraints
- The Wiener–Askey Polynomial Chaos for Stochastic Differential Equations
- The design and analysis of computer experiments
- The multi-element probabilistic collocation method (ME-PCM): Error analysis and applications
- Uncertainty quantification for complex systems with very high dimensional response using Grassmann manifold variations
Cited in (25 documents)
- AI in computational mechanics and engineering sciences
- Fusing nonlinear solvers with transformers for accelerating the solution of parametric transient problems
- A data-driven surrogate modeling approach for time-dependent incompressible Navier-Stokes equations with dynamic mode decomposition and manifold interpolation
- Rates of the strong uniform consistency with rates for conditional U-statistics estimators with general kernels on manifolds
- Probabilistic learning inference of boundary value problem with uncertainties based on Kullback-Leibler divergence under implicit constraints
- Probabilistic learning on manifolds constrained by nonlinear partial differential equations for small datasets
- Probabilistic model updating via variational Bayesian inference and adaptive Gaussian process modeling
- Conditional deep surrogate models for stochastic, high-dimensional, and multi-fidelity systems
- Uncertainty quantification for complex systems with very high dimensional response using Grassmann manifold variations
- Grassmannian diffusion maps based surrogate modeling via geometric harmonics
- Data-driven uncertainty quantification in computational human head models
- Polynomial-chaos-based conditional statistics for probabilistic learning with heterogeneous data applied to atomic collisions of helium on graphite substrate
- Active learning with multifidelity modeling for efficient rare event simulation
- Grassmannian diffusion maps-based dimension reduction and classification for high-dimensional data
- Polynomial chaos expansions on principal geodesic Grassmannian submanifolds for surrogate modeling and uncertainty quantification
- Accurate data-driven surrogates of dynamical systems for forward propagation of uncertainty
- Projection pursuit adaptation on polynomial chaos expansions
- A survey of unsupervised learning methods for high-dimensional uncertainty quantification in black-box-type problems
- Handling noise and overfitting in surrogate models based on non-uniform rational basis spline entities
- Latent map Gaussian processes for mixed variable metamodeling
- On the influence of over-parameterization in manifold based surrogates and deep neural operators
- Scientific article, zbMATH DE number 6975079 (no title available)
- Data-driven projection pursuit adaptation of polynomial chaos expansions for dependent high-dimensional parameters
- Probabilistic-learning-based stochastic surrogate model from small incomplete datasets for nonlinear dynamical systems
- Rates of the strong uniform consistency for the kernel-type regression function estimators with general kernels on manifolds