Compositionally-warped Gaussian processes
Abstract: The Gaussian process (GP) is a nonparametric prior distribution over functions indexed by time, space, or other high-dimensional index sets. The GP is a flexible model, yet its limitation follows from its very nature: it can only model Gaussian marginal distributions. To model non-Gaussian data, a GP can be warped by a nonlinear transformation (or warping), as performed by warped GPs (WGPs) and more computationally demanding alternatives such as Bayesian WGPs and deep GPs. However, the WGP requires a numerical approximation of the inverse warping for prediction, which increases the computational complexity in practice. To sidestep this issue, we construct a novel class of warpings consisting of compositions of multiple elementary functions, for which the inverse is known explicitly. We then propose the compositionally-warped GP (CWGP), a non-Gaussian generative model whose expressiveness follows from its deep compositional architecture and whose computational efficiency is guaranteed by the analytical inverse warping. Experimental validation on synthetic and real-world datasets confirms that the proposed CWGP is robust to the choice of warpings and provides more accurate point predictions, better-trained models, and shorter computation times than the WGP.
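The core mechanism described above (composing elementary monotone warpings whose inverses are known in closed form, so prediction never needs numerical inversion) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation; the specific elementary warpings (affine and sinh-arcsinh) and all parameter values are assumptions chosen for demonstration.

```python
import numpy as np

# Each elementary warping is a (forward, inverse) pair of monotone maps
# with an explicit closed-form inverse.
affine = (lambda x, a, b: a * x + b,
          lambda y, a, b: (y - b) / a)
sinh_arcsinh = (lambda x, s, d: np.sinh(d * np.arcsinh(x) - s),
                lambda y, s, d: np.sinh((np.arcsinh(y) + s) / d))

def compose(layers):
    """Compose (forward, inverse, params) layers into a single warping.

    The composition of invertible maps is inverted by applying the
    elementary inverses in reverse order -- no numerical root-finding.
    """
    def forward(x):
        for fwd, _, p in layers:
            x = fwd(x, *p)
        return x
    def inverse(y):
        for _, inv, p in reversed(layers):
            y = inv(y, *p)
        return y
    return forward, inverse

# Example: a two-layer warping phi = affine(2, -1) . sinh_arcsinh(0.5, 1.2).
layers = [(sinh_arcsinh[0], sinh_arcsinh[1], (0.5, 1.2)),
          (affine[0], affine[1], (2.0, -1.0))]
phi, phi_inv = compose(layers)

x = np.linspace(-2.0, 2.0, 5)
# The inverse recovers the input exactly (up to floating point),
# which is what makes prediction cheap in this class of models.
assert np.allclose(phi_inv(phi(x)), x)
```

In a CWGP-style model, `phi` maps latent GP values to the non-Gaussian observation space, and `phi_inv` maps observations back to the latent space for training and prediction.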
Cites work
- scientific article; zbMATH DE number 3716479 (no title available)
- scientific article; zbMATH DE number 5060482 (no title available)
- scientific article; zbMATH DE number 3273551 (no title available)
- A Family of Nonparametric Density Estimation Algorithms
- An Analysis of Transformations Revisited
- An efficient method for finding the minimum of a function of several variables without calculating derivatives
- An introduction to measure theory
- Bayesian learning for neural networks
- Computationally efficient convolved multiple output Gaussian processes
- Deep learning
- Density estimation by dual ascent of the log-likelihood
- Fundamentals of stochastic filtering
- Gaussian processes for machine learning
- Inverse Box-Cox: the power-normal distribution
- Learning deep architectures for AI
- Modular representation of layered neural networks
- Systems of frequency curves generated by methods of translation
- Sinh-arcsinh distributions
- Stability of a 4th-order curvature condition arising in optimal transport theory
Cited in (5)
- Binary spatial random field reconstruction from non-Gaussian inhomogeneous time-series observations
- Gaussian fuzzy theoretic analysis for variational learning of nested compositions
- Warped Gaussian processes and derivative-based sequential designs for functions with heterogeneous variations
- HEBO: An Empirical Study of Assumptions in Bayesian Optimisation
- Deep Compositional Spatial Models