\(\pi\)VAE: a stochastic process prior for Bayesian deep learning with MCMC
Publication:2103969
Abstract: Stochastic processes provide a mathematically elegant way to model complex data. In theory, they provide flexible priors over function classes that can encode a wide range of interesting assumptions. In practice, however, efficient inference by optimisation or marginalisation is difficult, a problem further exacerbated with big data and high-dimensional input spaces. We propose a novel variational autoencoder (VAE) called the prior encoding variational autoencoder (\(\pi\)VAE). The \(\pi\)VAE is finitely exchangeable and Kolmogorov consistent, and thus is a continuous stochastic process. We use \(\pi\)VAE to learn low-dimensional embeddings of function classes. We show that our framework can accurately learn expressive function classes such as Gaussian processes, but also properties of functions to enable statistical inference (such as the integral of a log Gaussian process). For popular tasks, such as spatial interpolation, \(\pi\)VAE achieves state-of-the-art performance both in terms of accuracy and computational efficiency. Perhaps most usefully, we demonstrate that the low-dimensional independently distributed latent space representation learnt provides an elegant and scalable means of performing Bayesian inference for stochastic processes within probabilistic programming languages such as Stan.
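The final sentence of the abstract describes a two-stage idea: once a decoder for a function class has been trained, Bayesian inference for a new function reduces to MCMC over its low-dimensional latent representation. The following is a minimal toy sketch of that idea, not the paper's implementation: the "decoder" is a fixed random linear map, the sampler is plain random-walk Metropolis rather than Stan's HMC, and all names (W, b, sigma, z) are illustrative assumptions.

```python
# Toy sketch (assumption): MCMC over a low-dimensional latent z of a fixed
# "decoder", standing in for inference with a pre-trained decoder network.
import numpy as np

rng = np.random.default_rng(0)

D, N = 5, 40                       # latent dimension, number of observations
W = rng.normal(size=(N, D))        # stand-in for pre-trained decoder weights
b = rng.normal(size=N)             # stand-in for decoder bias
sigma = 0.1                        # observation noise

z_true = rng.normal(size=D)        # latent used only to simulate data
y = W @ z_true + b + sigma * rng.normal(size=N)

def log_post(z):
    # independent standard-normal prior on the latents + Gaussian likelihood
    log_prior = -0.5 * np.sum(z ** 2)
    resid = y - (W @ z + b)
    log_lik = -0.5 * np.sum(resid ** 2) / sigma ** 2
    return log_prior + log_lik

# Random-walk Metropolis over the low-dimensional latent space.
z = np.zeros(D)
samples = []
for _ in range(5000):
    prop = z + 0.05 * rng.normal(size=D)
    if np.log(rng.uniform()) < log_post(prop) - log_post(z):
        z = prop
    samples.append(z.copy())

print("posterior mean of z:", np.mean(samples[1000:], axis=0))
print("true z:             ", z_true)
```

In the paper's setting the decoder would be the learnt neural network and the sampler a gradient-based method such as Stan's HMC; the point of the sketch is only that the inference target is the low-dimensional latent vector rather than the full stochastic process.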
Cites work
- scientific article; zbMATH DE number 6377992
- scientific article; zbMATH DE number 4074523
- scientific article; zbMATH DE number 1093829
- scientific article; zbMATH DE number 3046994
- An Introduction to Variational Autoencoders
- An explicit link between Gaussian fields and Gaussian Markov random fields: the stochastic partial differential equation approach
- Approximate Bayesian inference for latent Gaussian models by using integrated nested Laplace approximations (with discussion)
- Bayesian learning for neural networks
- Gaussian processes for machine learning.
- Log Gaussian Cox Processes
- Mixtures of Dirichlet processes with applications to Bayesian nonparametric problems
- On the geometric ergodicity of Hamiltonian Monte Carlo
- Reducing the Dimensionality of Data with Neural Networks
- Spectra of some self-exciting and mutually exciting point processes
- Stochastic processes and applications. Diffusion processes, the Fokker-Planck and Langevin equations