Fully Bayesian Inference for Latent Variable Gaussian Process Models
DOI: 10.1137/22M1525600 | arXiv: 2211.02218 | MaRDI QID: Q6188699 | FDO: Q6188699
Authors: Akshay Ganesh Iyer, Wei Chen, Daniel W. Apley
Publication date: 11 January 2024
Published in: SIAM/ASA Journal on Uncertainty Quantification
Abstract: Real engineering and scientific applications often involve one or more qualitative inputs. Standard Gaussian processes (GPs), however, cannot directly accommodate qualitative inputs. The recently introduced latent variable Gaussian process (LVGP) overcomes this issue by first mapping each qualitative factor to underlying latent variables (LVs) and then using a standard GP covariance function over these LVs. The LVs are estimated, like the other GP hyperparameters, by maximum likelihood estimation and then plugged into the prediction expressions. However, this plug-in approach does not account for uncertainty in the estimation of the LVs, which can be significant, especially with limited training data. In this work, we develop a fully Bayesian approach for the LVGP model and for visualizing the effects of the qualitative inputs via their LVs. We also develop approximations for scaling up LVGPs and for fully Bayesian inference of the LVGP hyperparameters. We conduct numerical studies comparing plug-in inference and fully Bayesian inference on several engineering models and material design applications. In contrast to previous studies on standard GP modeling, which have largely concluded that a fully Bayesian treatment offers limited improvements, our results show that for LVGP modeling it offers significant improvements in prediction accuracy and uncertainty quantification over the plug-in approach.
Full work available at URL: https://arxiv.org/abs/2211.02218
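The sketch below illustrates, in plain NumPy, the LVGP covariance idea summarized in the abstract: each level of a qualitative factor is mapped to a low-dimensional latent vector, and a standard RBF covariance is applied to the quantitative inputs augmented with those latent coordinates. This is a minimal, hypothetical illustration only; the function and variable names (lvgp_kernel, z_map, etc.) are assumptions and do not reproduce the authors' implementation or their fully Bayesian treatment.

```python
import numpy as np

def lvgp_kernel(X_quant, levels, z_map, lengthscale=1.0, variance=1.0):
    """Covariance matrix for an LVGP-style kernel (illustrative sketch).

    X_quant : (n, p) array of quantitative inputs
    levels  : (n,) integer codes of the qualitative factor
    z_map   : (L, d) latent coordinates, one row per qualitative level
              (point-estimated in the plug-in approach; sampled over in a
              fully Bayesian treatment)
    """
    # Replace each qualitative level by its latent coordinates and
    # concatenate with the quantitative inputs.
    Z = z_map[levels]                        # (n, d)
    U = np.hstack([X_quant, Z])              # (n, p + d)

    # Standard squared-exponential (RBF) kernel on the augmented inputs.
    sq_dists = np.sum((U[:, None, :] - U[None, :, :]) ** 2, axis=-1)
    return variance * np.exp(-0.5 * sq_dists / lengthscale ** 2)

# Toy usage: 5 points, 1 quantitative input, a 3-level qualitative factor
# mapped into a 2-D latent space (latent positions chosen arbitrarily here).
rng = np.random.default_rng(0)
X = rng.uniform(size=(5, 1))
lv = np.array([0, 1, 2, 1, 0])
z = rng.normal(size=(3, 2))
K = lvgp_kernel(X, lv, z)
print(K.shape)  # (5, 5)
```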
Recommendations
- Latent variable Gaussian process models: a rank-based analysis and an alternative approach
- Latent map Gaussian processes for mixed variable metamodeling
- Variational inference for latent variables and uncertain inputs in Gaussian processes
- Pseudo-marginal Bayesian inference for Gaussian process latent variable models
- Generic inference in latent Gaussian process models
Keywords: Gaussian process; latent variables; uncertainty quantification; categorical variables; fully Bayesian inference
Cites Work
- The no-U-turn sampler: adaptively setting path lengths in Hamiltonian Monte Carlo
- Strictly Proper Scoring Rules, Prediction, and Estimation
- Analysis methods for computer experiments: how to assess and what counts?
- Bayesian Design and Analysis of Computer Experiments: Use of Derivatives in Surface Prediction
- A unifying view of sparse approximate Gaussian process regression
- Global sensitivity indices for nonlinear mathematical models and their Monte Carlo estimates
- Assessment of uncertainty in computer experiments from universal to Bayesian kriging
- Group kernels for Gaussian process metamodels with categorical inputs
- Bayesian optimization of variable-size design space problems
- Flexible Correlation Structure for Accurate Prediction and Uncertainty Quantification in Bayesian Gaussian Process Emulation of a Computer Model
Cited In (1)