Split Hamiltonian Monte Carlo
From MaRDI portal
Publication:892476
Abstract: We show how the Hamiltonian Monte Carlo algorithm can sometimes be sped up by "splitting" the Hamiltonian in a way that allows much of the movement around the state space to be done at low computational cost. One context where this is possible is when the log density of the distribution of interest (the potential energy function) can be written as the log of a Gaussian density, which is a quadratic function, plus a slowly varying function. Hamiltonian dynamics for quadratic energy functions can be solved analytically. With the splitting technique, only the slowly varying part of the energy needs to be handled numerically, and this can be done with a larger stepsize (and hence fewer steps) than would be necessary with a direct simulation of the dynamics. Another context where splitting helps is when the most important terms of the potential energy function and its gradient can be evaluated quickly, with only a slowly varying part requiring costly computations. With splitting, the quick portion can be handled with a small stepsize, while the costly portion uses a larger stepsize. We show that both of these splitting approaches can reduce the computational cost of sampling from the posterior distribution for a logistic regression model, using either a Gaussian approximation centered on the posterior mode, or a Hamiltonian split into a term that depends on only a small number of critical cases, and another term that involves the larger number of cases whose influence on the posterior distribution is small. Supplemental materials for this paper are available online.
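The first splitting described in the abstract can be sketched as follows. A minimal, illustrative Python sketch (not the authors' code): the potential is split as U(q) = U0(q) + U1(q), where U0 is the quadratic energy of a Gaussian (here a standard normal with unit mass, so its exact dynamics is a rotation in phase space) and only the slowly varying residual U1 is integrated numerically. The residual gradient `grad_U1` is a hypothetical user-supplied function.

```python
import numpy as np

def split_leapfrog(q, p, grad_U1, eps, n_steps):
    """One trajectory of the split integrator.

    The Gaussian part U0(q) = q.q/2 (standard normal, unit mass) is
    solved exactly as a rotation of (q, p); only the slowly varying
    residual U1 gets a numerical (kick) update, so a larger stepsize
    eps can be used than full leapfrog would allow.
    """
    c, s = np.cos(eps), np.sin(eps)
    for _ in range(n_steps):
        p = p - 0.5 * eps * grad_U1(q)           # half kick from residual U1
        q, p = c * q + s * p, -s * q + c * p     # exact dynamics for Gaussian U0
        p = p - 0.5 * eps * grad_U1(q)           # half kick from residual U1
    return q, p
```

Because the composition is symmetric (half kick, exact Gaussian flow, half kick), the map is time-reversible under momentum negation and volume-preserving, which is what makes it usable inside a Metropolis-corrected HMC update.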
Recommendations
- Split Hamiltonian Monte Carlo revisited
- Symmetrically processed splitting integrators for enhanced Hamiltonian Monte Carlo sampling
- MCMC using Hamiltonian dynamics
- Hamiltonian Monte Carlo with energy conserving subsampling
- Multi-stage splitting integrators for sampling with modified Hamiltonian Monte Carlo methods
Cites work
- scientific article, zbMATH DE number 1825523 (title not available)
- Accurate Approximations for Posterior Moments and Marginal Densities
- Hybrid Monte Carlo on Hilbert spaces
- Practical Markov Chain Monte Carlo
- Simulating Hamiltonian Dynamics
- The no-U-turn sampler: adaptively setting path lengths in Hamiltonian Monte Carlo
Cited in (16)
- Geodesic Lagrangian Monte Carlo over the space of positive definite matrices: with application to Bayesian spectral density estimation
- Adaptive parameters tuning based on energy-preserving splitting integration for Hamiltonian Monte Carlo method
- Stochastic approximation Hamiltonian Monte Carlo
- Sampling constrained probability distributions using spherical augmentation
- Split Hamiltonian Monte Carlo revisited
- Recycling intermediate steps to improve Hamiltonian Monte Carlo
- An Efficient Coalescent Model for Heterochronously Sampled Molecular Data
- Scalable Bayes via barycenter in Wasserstein space
- Hamiltonian Monte Carlo based on evidence framework for Bayesian learning to neural network
- Leave Pima Indians alone: binary regression as a benchmark for Bayesian computation
- Distributed Bayesian Inference in Linear Mixed-Effects Models
- A New Optimality Property of Strang’s Splitting
- An algorithm for distributed Bayesian inference
- Hamiltonian Monte Carlo acceleration using surrogate functions with random bases
- Precomputing strategy for Hamiltonian Monte Carlo method based on regularity in parameter space
- Bayesian Inference on Local Distributions of Functions and Multidimensional Curves with Spherical HMC Sampling