Locally induced Gaussian processes for large-scale simulation experiments
From MaRDI portal
Publication:2058747
Abstract: Gaussian processes (GPs) serve as flexible surrogates for complex surfaces, but buckle under the cubic cost of matrix decompositions for large training data sizes. The geospatial and machine learning communities suggest pseudo-inputs, or inducing points, as one strategy for obtaining an approximation that eases that computational burden. However, we show how the placement of inducing points, and their number, can be thwarted by pathologies, especially in large-scale dynamic response surface modeling tasks. As a remedy, we suggest porting the inducing point idea, which is usually applied globally, over to a more local context where selection is both easier and faster. In this way, our proposed methodology hybridizes global inducing point and data subset-based local GP approximation. A cascade of strategies for planning the selection of local inducing points is provided, and comparisons are drawn to related methodology, with emphasis on computer surrogate modeling applications. We show that local inducing points extend their global and data-subset component parts on the accuracy--computational efficiency frontier. Illustrative examples are provided on benchmark data and a large-scale real-simulation satellite drag interpolation problem.
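The global inducing-point approximation the abstract refers to can be illustrated with a minimal sketch. The code below is not the paper's local method; it shows the standard subset-of-regressors (SoR) predictive mean with m globally placed inducing points, which reduces the O(n^3) cost of a full GP to O(n m^2). The squared-exponential kernel, the gridded inducing inputs `Xbar`, and the toy lengthscale/noise values are all illustrative assumptions, not choices from the paper.

```python
import numpy as np

def sq_exp(X1, X2, lengthscale=0.1):
    """Squared-exponential kernel matrix between row-sets X1 and X2 (assumed form)."""
    d2 = (np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :]
          - 2.0 * X1 @ X2.T)
    return np.exp(-d2 / (2.0 * lengthscale**2))

rng = np.random.default_rng(0)
n, m = 500, 30                       # n training points, m << n inducing points
X = rng.uniform(0, 1, (n, 1))
y = np.sin(10 * X[:, 0]) + 0.05 * rng.standard_normal(n)   # noisy toy response
Xstar = np.linspace(0, 1, 200)[:, None]                    # test grid

# Global inducing inputs on a simple grid. The paper's contribution is smarter,
# *local* selection of such points; this sketch only shows the global baseline.
Xbar = np.linspace(0, 1, m)[:, None]
Kmm = sq_exp(Xbar, Xbar) + 1e-8 * np.eye(m)   # jitter for numerical stability
Knm = sq_exp(X, Xbar)                         # n x m cross-covariances
Ksm = sq_exp(Xstar, Xbar)                     # test-to-inducing covariances
noise = 0.05**2

# SoR predictive mean: K*m (sigma^2 Kmm + Kmn Knm)^{-1} Kmn y.
# Only m x m systems are solved, so the n x n kernel matrix is never formed.
A = Knm.T @ Knm + noise * Kmm
mu_sor = Ksm @ np.linalg.solve(A, Knm.T @ y)
```

On this toy problem the m = 30 inducing points recover the sinusoid closely at a fraction of the full-GP cost; the pathologies the abstract mentions arise when a fixed global set of inducing points must serve a large, dynamic input region, which motivates the local selection scheme.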
Recommendations
- Emulating satellite drag from large simulation experiments
- Massively parallel approximate Gaussian process regression
- Scaled Vecchia approximation for fast computer-model emulation
- Exploiting Variance Reduction Potential in Local Gaussian Process Search
- Large scale variable fidelity surrogate modeling
Cites work
- scientific article; zbMATH DE number 6377992 (no title available)
- scientific article; zbMATH DE number 5964910 (no title available)
- scientific article; zbMATH DE number 5281111 (no title available)
- scientific article; zbMATH DE number 45848 (no title available)
- scientific article; zbMATH DE number 1522714 (no title available)
- scientific article; zbMATH DE number 3799842 (no title available)
- scientific article; zbMATH DE number 823069 (no title available)
- A Comparison of Three Methods for Selecting Values of Input Variables in the Analysis of Output from a Computer Code
- A Limited Memory Algorithm for Bound Constrained Optimization
- A general framework for Vecchia approximations of Gaussian processes
- A supermartingale approach to Gaussian process based sequential design of experiments
- A unifying view of sparse approximate Gaussian process regression
- Adaptive Gaussian Process Approximation for Bayesian Inference with Expensive Likelihood Functions
- Analyzing Nonstationary Spatial Data Using Piecewise Gaussian Processes
- Approximating Likelihoods for Large Spatial Data Sets
- Bayesian treed Gaussian process models with an application to computer modeling
- Computer experiment designs for accurate prediction
- Design and analysis of computer experiments. With comments and a rejoinder by the authors
- Efficient emulators of computer experiments using compactly supported correlation functions, with an application to cosmology
- Emulating satellite drag from large simulation experiments
- Exploiting Variance Reduction Potential in Local Gaussian Process Search
- Exploratory designs for computational experiments
- Fast estimation of \(\mathrm{tr}(f(A))\) via stochastic Lanczos quadrature
- Gaussian Predictive Process Models for Large Spatial Data Sets
- Gaussian processes for machine learning.
- Hilbert space methods for reduced-rank Gaussian process regression
- Knot selection in sparse Gaussian processes with a variational objective function
- Massively parallel approximate Gaussian process regression
- Mercer kernels and integrated variance experimental design: connections between Gaussian process regression and polynomial approximation
- Parameter estimation in high dimensional Gaussian distributions
- Quantifying uncertainties on excursion sets under a Gaussian random field prior
- Recursive estimation for sparse Gaussian process regression
- Regularization algorithms for learning that are equivalent to multilayer networks
- Sparse on-line Gaussian processes
- Spectral approximation of the IMSE criterion for optimal designs in kernel-based interpolation models
- Stochastic kriging for simulation metamodeling
- The design and analysis of computer experiments
- Variational inference for sparse spectrum Gaussian process regression
Cited in (14)
- Emulating satellite drag from large simulation experiments
- Scaled Vecchia approximation for fast computer-model emulation
- Active Learning for Deep Gaussian Process Surrogates
- Real-Time Local GP Model Learning
- Massively parallel approximate Gaussian process regression
- Computationally efficient algorithm for Gaussian process regression in case of structured samples
- Batch-sequential design and heteroskedastic surrogate modeling for delta smelt conservation
- Large-scale local surrogate modeling of stochastic simulation experiments
- A Global-Local Approximation Framework for Large-Scale Gaussian Process Modeling
- Sensitivity Prewarping for Local Surrogate Modeling
- Augmenting a Simulation Campaign for Hybrid Computer Model and Field Data Experiments
- Large scale variable fidelity surrogate modeling
- Vecchia-approximated Deep Gaussian Processes for Computer Experiments
- Exploiting Variance Reduction Potential in Local Gaussian Process Search