Vecchia-approximated Deep Gaussian Processes for Computer Experiments

From MaRDI portal
Publication:84900

DOI: 10.1080/10618600.2022.2129662
arXiv: 2204.02904
OpenAlex: W4300537828
MaRDI QID: Q84900


Authors: Annie Sauer, Andrew Cooper, Robert B. Gramacy


Publication date: 6 April 2022

Published in: Journal of Computational and Graphical Statistics

Abstract: Deep Gaussian processes (DGPs) upgrade ordinary GPs through functional composition, in which intermediate GP layers warp the original inputs, providing flexibility to model non-stationary dynamics. Two DGP regimes have emerged in recent literature. A "big data" regime, prevalent in machine learning, favors approximate, optimization-based inference for fast, high-fidelity prediction. A "small data" regime, preferred for computer surrogate modeling, deploys posterior integration for enhanced uncertainty quantification (UQ). We aim to bridge this gap by expanding the capabilities of Bayesian DGP posterior inference through the incorporation of the Vecchia approximation, allowing linear computational scaling without compromising accuracy or UQ. We are motivated by surrogate modeling of simulation campaigns with upwards of 100,000 runs - a size too large for previous fully-Bayesian implementations - and demonstrate prediction and UQ superior to that of "big data" competitors. All methods are implemented in the "deepgp" package on CRAN.
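For context on the linear scaling claimed in the abstract, the Vecchia approximation replaces the joint Gaussian likelihood with a product of low-dimensional conditionals. The notation below is generic (not taken from the paper), with c(i) denoting a conditioning set of at most m previously ordered neighbors:

    p(y_1, \dots, y_n) = \prod_{i=1}^{n} p(y_i \mid y_{1:i-1}) \approx \prod_{i=1}^{n} p\left(y_i \mid y_{c(i)}\right), \qquad c(i) \subseteq \{1, \dots, i-1\}, \quad |c(i)| \le m.

Each factor then involves only an m-by-m covariance matrix, so for fixed m the cost of a likelihood evaluation grows linearly in n rather than at the O(n^3) rate of an exact GP.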


Full work available at URL: https://arxiv.org/abs/2204.02904
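The abstract points to the deepgp package on CRAN. The sketch below illustrates how fitting a Vecchia-approximated two-layer DGP might look in R, assuming deepgp's fit_two_layer(), trim(), and predict() interface with vecchia and m arguments as described in the package documentation; the toy data and all settings are illustrative, not taken from the paper.

    library(deepgp)  # assumes deepgp is installed from CRAN

    ## Toy 1-d response with a regime change (illustrative only)
    f <- function(x) ifelse(x < 0.5, sin(8 * pi * x), 2 * (x - 0.5))
    x <- matrix(seq(0, 1, length.out = 300), ncol = 1)
    y <- as.vector(f(x)) + rnorm(nrow(x), sd = 0.01)

    ## Two-layer Bayesian DGP; vecchia = TRUE requests the Vecchia approximation
    ## with (assumed) m nearest neighbors per conditioning set
    fit <- fit_two_layer(x, y, nmcmc = 2000, vecchia = TRUE, m = 25)

    ## Discard burn-in and thin the MCMC samples, then predict at new inputs
    fit <- trim(fit, burn = 1000, thin = 2)
    x_new <- matrix(seq(0, 1, length.out = 100), ncol = 1)
    fit <- predict(fit, x_new)

    ## Posterior predictive mean and variance for UQ
    ## (field names mean/s2 assumed from the package documentation)
    head(cbind(mean = fit$mean, s2 = fit$s2))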












