Hilbert space methods for reduced-rank Gaussian process regression
Abstract: This paper proposes a novel scheme for reduced-rank Gaussian process regression. The method is based on an approximate series expansion of the covariance function in terms of an eigenfunction expansion of the Laplace operator in a compact subset of ℝ^d. On this approximate eigenbasis, the eigenvalues of the covariance function can be expressed as simple functions of the spectral density of the Gaussian process, which allows the GP inference to be solved under a computational cost scaling as O(nm²) (initial) and O(m³) (hyperparameter learning) with m basis functions and n data points. Furthermore, the basis functions are independent of the parameters of the covariance function, which allows for very fast hyperparameter learning. The approach also allows for rigorous error analysis with Hilbert space theory, and we show that the approximation becomes exact when the size of the compact subset and the number of eigenfunctions go to infinity. We also show that the convergence rate of the truncation error is independent of the input dimensionality provided that the differentiability order of the covariance function increases appropriately, and for the squared exponential covariance function it is always bounded by ~1/m regardless of the input dimensionality. The expansion generalizes to Hilbert spaces with an inner product which is defined as an integral over a specified input density. The method is compared to previously proposed methods theoretically and through empirical tests with simulated and real data.
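Concretely, in one dimension on a domain [-L, L] the Laplacian eigenfunctions and eigenvalues have closed forms, and the approximate kernel eigenvalues are the spectral density of the covariance function evaluated at the square roots of the Laplacian eigenvalues. The following is a minimal Python sketch of that construction for the squared exponential covariance, assuming the standard Dirichlet eigenbasis on [-L, L]; the function name, domain half-width, and hyperparameter values are illustrative choices, not taken from the paper.

```python
import numpy as np

def hilbert_gp_fit_predict(x, y, x_star, m=32, L=5.0,
                           lengthscale=1.0, sigma_f=1.0, sigma_n=0.1):
    """Reduced-rank GP regression on [-L, L] via the Laplace eigenbasis.

    Eigenfunctions: phi_j(x) = sqrt(1/L) * sin(pi*j*(x + L) / (2L)),
    with Laplacian eigenvalues lambda_j = (pi*j / (2L))^2. The kernel
    eigenvalues are S(sqrt(lambda_j)), where S is the spectral density
    of the covariance function (squared exponential here).
    All default parameter values are illustrative, not from the paper.
    """
    j = np.arange(1, m + 1)
    sqrt_lam = np.pi * j / (2.0 * L)          # sqrt of Laplacian eigenvalues

    # Spectral density of the 1D squared-exponential kernel.
    S = (sigma_f**2 * np.sqrt(2.0 * np.pi) * lengthscale
         * np.exp(-0.5 * (lengthscale * sqrt_lam)**2))

    def phi(t):
        # (n, m) matrix of eigenfunctions evaluated at inputs t.
        return np.sin(np.outer(t + L, sqrt_lam)) / np.sqrt(L)

    Phi = phi(x)                              # (n, m) basis-function matrix
    # m x m system: O(nm^2) to form, O(m^3) to solve.
    A = Phi.T @ Phi + sigma_n**2 * np.diag(1.0 / S)
    w = np.linalg.solve(A, Phi.T @ y)

    Phi_star = phi(x_star)
    mean = Phi_star @ w
    var = sigma_n**2 * np.sum(
        Phi_star * np.linalg.solve(A, Phi_star.T).T, axis=1)
    return mean, var

# Toy usage: noisy samples of a smooth function.
rng = np.random.default_rng(0)
x = rng.uniform(-3, 3, 200)
y = np.sin(x) + 0.1 * rng.standard_normal(x.size)
mean, var = hilbert_gp_fit_predict(x, y, np.linspace(-3, 3, 50))
```

Note how the sketch reflects the fast-hyperparameter-learning property claimed in the abstract: the basis matrix Phi does not depend on the kernel hyperparameters, so it could be precomputed once and reused, with only the diagonal of spectral-density values S recomputed per hyperparameter setting.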
Recommendations
- Switching and Learning in Feedback Systems
- Efficient reduced-rank methods for Gaussian processes with eigenfunction expansions
- Gaussian Process Subspace Prediction for Model Reduction
- Practical Hilbert space approximate Bayesian Gaussian processes for probabilistic programming
- Gaussian Process Regression on Nested Spaces
- Reduced rank ridge regression and its kernel extensions
- Gaussian processes with built-in dimensionality reduction: applications to high-dimensional uncertainty propagation
- Dimension reduction via Gaussian ridge functions
- A unifying view of sparse approximate Gaussian process regression
Cites work
- scientific article; zbMATH DE number 3176450
- scientific article; zbMATH DE number 3751955
- scientific article; zbMATH DE number 45848
- scientific article; zbMATH DE number 48198
- scientific article; zbMATH DE number 3581570
- scientific article; zbMATH DE number 3633568
- scientific article; zbMATH DE number 3998594
- scientific article; zbMATH DE number 876688
- scientific article; zbMATH DE number 6276239
- scientific article; zbMATH DE number 3249395
- scientific article; zbMATH DE number 3320868
- scientific article; zbMATH DE number 961607
- A Correspondence Between Bayesian Estimation on Stochastic Processes and Smoothing by Splines
- A unifying framework for Gaussian process pseudo-point approximations using power expectation propagation
- A unifying view of sparse approximate Gaussian process regression
- An explicit link between Gaussian fields and Gaussian Markov random fields: the stochastic partial differential equation approach
- Application of FFT-based algorithms for large-scale universal kriging problems
- GPstuff: Bayesian modeling with Gaussian processes
- Gaussian processes for machine learning
- Handbook of Markov Chain Monte Carlo
- Improved matrix algorithms via the subsampled randomized Hadamard transform
- Learning Curves for Gaussian Process Regression: Approximations and Bounds
- MCMC using Hamiltonian dynamics
- Monte Carlo strategies in scientific computing
- On the low-rank approximation by the pivoted Cholesky decomposition
- Sparse on-line Gaussian processes
- Sparse spectrum Gaussian process regression
- Statistical and computational inverse problems
- Stochastic Equations in Infinite Dimensions
- String and membrane Gaussian processes
- Switching and Learning in Feedback Systems
- Towards a practicable Bayesian nonparametric density estimator
- Variational Fourier features for Gaussian processes
Cited in (29)
- A data-driven method for parametric PDE eigenvalue problems using Gaussian process with different covariance functions
- Finite element representations of Gaussian processes: balancing numerical and statistical accuracy
- Fast generation of Gaussian random fields for direct numerical simulations of stochastic transport
- Consistent online Gaussian process regression without the sample complexity bottleneck
- Large-scale local surrogate modeling of stochastic simulation experiments
- Efficient reduced-rank methods for Gaussian processes with eigenfunction expansions
- Switching and Learning in Feedback Systems
- The reliability factor: modeling individual reliability with multiple items from a single assessment
- Convergence of sparse variational inference in Gaussian processes regression
- Detecting and diagnosing prior and likelihood sensitivity with power-scaling
- A localized ensemble of approximate Gaussian processes for fast sequential emulation
- Non-stationary multi-layered Gaussian priors for Bayesian inversion
- Short communication: projection of functionals and fast pricing of exotic options
- Orthonormal expansions for translation-invariant kernels
- Variational inference at glacier scale
- scientific article; zbMATH DE number 7370622
- Estimating the effects of a California gun control program with multitask Gaussian processes
- Practical Hilbert space approximate Bayesian Gaussian processes for probabilistic programming
- Low-rank statistical finite elements for scalable model-data synthesis
- Online Bayesian inference and learning of Gaussian-process state-space models
- Conjugate gradients for kernel machines
- Scalable computations for nonstationary Gaussian processes
- Probabilistic approach to limited-data computed tomography reconstruction
- rts2
- Approximate leave-future-out cross-validation for Bayesian time series models
- Gaussian Process Subspace Prediction for Model Reduction
- Gaussian process manifold interpolation for probabilistic atrial activation maps and uncertain conduction velocity
- Stochastic PDE representation of random fields for large-scale Gaussian process regression and statistical finite element analysis
- Locally induced Gaussian processes for large-scale simulation experiments