Stable and efficient Gaussian process calculations
zbMATH Open: 1235.62126 · MaRDI QID: Q2880909
Authors: Leslie Foster, Alex Waagen, Nabeela Aijaz, Michael Hurley, Apolonio Luis, Joel Rinsky, Chandrika Satyavolu, Michael J. Way, Paul Gazis, A. N. Srivastava
Publication date: 17 April 2012
Published in: Journal of Machine Learning Research (JMLR)
Full work available at URL: http://www.jmlr.org/papers/v10/foster09a.html
MSC classifications
- Learning and adaptive systems in artificial intelligence (68T05)
- Inference from stochastic processes and prediction (62M20)
- Statistical astronomy (85A35)
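This record carries no abstract, but its topic, numerically stable Gaussian process calculations, is conventionally handled by Cholesky-factoring the noise-augmented kernel matrix rather than inverting it explicitly. A minimal illustrative sketch of that standard approach follows (it is not the paper's specific method, whose details are not reproduced in this record; kernel parameters and data are made up for the example):

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
    # Squared-exponential (RBF) kernel matrix between two point sets.
    d2 = (np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :]
          - 2.0 * X1 @ X2.T)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def gp_predict(X, y, Xstar, noise=1e-2):
    # Cholesky-based GP regression: solving triangular systems with L
    # is numerically stabler than forming inv(K + noise * I) directly.
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    Ks = rbf_kernel(X, Xstar)
    mean = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.diag(rbf_kernel(Xstar, Xstar)) - np.sum(v**2, axis=0)
    return mean, var

# Toy regression problem: noisy samples of sin(x).
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, (20, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(20)
Xs = np.linspace(-3, 3, 5)[:, None]
mean, var = gp_predict(X, y, Xs)
```

The small `noise` term on the diagonal both models observation noise and keeps the factorization well conditioned; the paper's contribution concerns making such computations stable and efficient at scale.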
Cited In (12)
- A scalable Gaussian process analysis algorithm for biomass monitoring
- Efficient reduced-rank methods for Gaussian processes with eigenfunction expansions
- Sparse inverse kernel Gaussian process regression
- Consistent online Gaussian process regression without the sample complexity bottleneck
- Large scale variable fidelity surrogate modeling
- Variable Selection for Gaussian Process Models using Experimental Design-Based Subagging
- Convergence of sparse variational inference in Gaussian process regression
- Adaptive joint distribution learning
- Error-controlled model approximation for Gaussian process morphable models
- Efficient Gaussian process regression for large datasets
- On maximum volume submatrices and cross approximation for symmetric semidefinite and diagonally dominant matrices
- Efficient approximation of random fields for numerical applications.