Stable and efficient Gaussian process calculations
zbMATH Open: 1235.62126; MaRDI QID: Q2880909; FDO: Q2880909
Apolonio Luis, Leslie Foster, Michael Hurley, Joel Rinsky, Chandrika Satyavolu, Paul Gazis, A. N. Srivastava, Alex Waagen, Nabeela Aijaz, Michael J. Way
Publication date: 17 April 2012
Published in: Journal of Machine Learning Research (JMLR)
Full work available at URL: http://www.jmlr.org/papers/v10/foster09a.html
Mathematics Subject Classification
- Learning and adaptive systems in artificial intelligence (68T05)
- Inference from stochastic processes and prediction (62M20)
- Statistical astronomy (85A35)
Cited In (11)
- A scalable Gaussian process analysis algorithm for biomass monitoring
- Efficient reduced-rank methods for Gaussian processes with eigenfunction expansions
- Efficient approximation of random fields for numerical applications
- Sparse inverse kernel Gaussian Process regression
- Title not available
- Consistent online Gaussian process regression without the sample complexity bottleneck
- Large scale variable fidelity surrogate modeling
- Variable Selection for Gaussian Process Models using Experimental Design-Based Subagging
- Adaptive joint distribution learning
- Error-controlled model approximation for Gaussian process morphable models
- On maximum volume submatrices and cross approximation for symmetric semidefinite and diagonally dominant matrices