Split Hamiltonian Monte Carlo
From MaRDI portal
Publication:892476
DOI: 10.1007/S11222-012-9373-1 · zbMATH Open: 1325.62018 · arXiv: 1106.5941 · OpenAlex: W1979619742 · MaRDI QID: Q892476
Authors: Babak Shahbaba, Shiwei Lan, W. O. Johnson, Radford Neal
Publication date: 19 November 2015
Published in: Statistics and Computing
Abstract: We show how the Hamiltonian Monte Carlo algorithm can sometimes be speeded up by "splitting" the Hamiltonian in a way that allows much of the movement around the state space to be done at low computational cost. One context where this is possible is when the log density of the distribution of interest (the potential energy function) can be written as the log of a Gaussian density, which is a quadratic function, plus a slowly varying function. Hamiltonian dynamics for quadratic energy functions can be analytically solved. With the splitting technique, only the slowly-varying part of the energy needs to be handled numerically, and this can be done with a larger stepsize (and hence fewer steps) than would be necessary with a direct simulation of the dynamics. Another context where splitting helps is when the most important terms of the potential energy function and its gradient can be evaluated quickly, with only a slowly-varying part requiring costly computations. With splitting, the quick portion can be handled with a small stepsize, while the costly portion uses a larger stepsize. We show that both of these splitting approaches can reduce the computational cost of sampling from the posterior distribution for a logistic regression model, using either a Gaussian approximation centered on the posterior mode, or a Hamiltonian split into a term that depends on only a small number of critical cases, and another term that involves the larger number of cases whose influence on the posterior distribution is small. Supplemental materials for this paper are available online.
Full work available at URL: https://arxiv.org/abs/1106.5941
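The first splitting scheme in the abstract can be illustrated with a short sketch: the potential energy is written as a quadratic (Gaussian) term, whose Hamiltonian dynamics have an exact solution (a rotation in phase space), plus a slowly varying residual term that alone is handled numerically with half-step momentum updates. This is a minimal one-dimensional sketch, not the authors' code; the residual term `U1 = 0.1*q**4`, the frequency `omega`, and all stepsize values are hypothetical choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Target log density = -U0(q) - U1(q), where U0 = 0.5*omega**2*q**2 is the
# Gaussian (quadratic) part solved analytically, and U1 is slowly varying.
# U1 here is a hypothetical quartic term chosen for illustration.
omega = 1.0
U1 = lambda q: 0.1 * q**4
grad_U1 = lambda q: 0.4 * q**3

def hamiltonian(q, p):
    return 0.5 * omega**2 * q**2 + U1(q) + 0.5 * p**2

def split_leapfrog(q, p, eps, n_steps):
    """Strang splitting: numeric half steps on U1 around exact U0 dynamics."""
    for _ in range(n_steps):
        p -= 0.5 * eps * grad_U1(q)      # half step: slowly varying part only
        # exact solution of the quadratic dynamics over time eps (a rotation)
        c, s = np.cos(omega * eps), np.sin(omega * eps)
        q, p = q * c + (p / omega) * s, -omega * q * s + p * c
        p -= 0.5 * eps * grad_U1(q)      # half step: slowly varying part only
    return q, p

def split_hmc(q0, n_samples, eps=0.5, n_steps=10):
    """Split HMC with a standard Metropolis accept/reject correction."""
    samples, q = [], q0
    for _ in range(n_samples):
        p = rng.standard_normal()                       # resample momentum
        q_new, p_new = split_leapfrog(q, p, eps, n_steps)
        # accept with probability min(1, exp(H(q,p) - H(q',p')))
        if rng.random() < np.exp(hamiltonian(q, p) - hamiltonian(q_new, p_new)):
            q = q_new
        samples.append(q)
    return np.array(samples)
```

Because the dominant quadratic part is integrated exactly, only the small `U1` gradient contributes numerical error, which is what lets the stepsize `eps` be larger than a plain leapfrog integrator would tolerate for the same acceptance rate.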
Recommendations
- Split Hamiltonian Monte Carlo revisited
- Symmetrically processed splitting integrators for enhanced Hamiltonian Monte Carlo sampling
- MCMC using Hamiltonian dynamics
- Hamiltonian Monte Carlo with energy conserving subsampling
- Multi-stage splitting integrators for sampling with modified Hamiltonian Monte Carlo methods
Cited In (16)
- Stochastic approximation Hamiltonian Monte Carlo
- A New Optimality Property of Strang’s Splitting
- Adaptive parameters tuning based on energy-preserving splitting integration for Hamiltonian Monte Carlo method
- Hamiltonian Monte Carlo acceleration using surrogate functions with random bases
- Leave Pima Indians alone: binary regression as a benchmark for Bayesian computation
- Hamiltonian Monte Carlo based on evidence framework for Bayesian learning to neural network
- Bayesian Inference on Local Distributions of Functions and Multidimensional Curves with Spherical HMC Sampling
- Sampling Constrained Probability Distributions Using Spherical Augmentation
- Title not available
- Recycling intermediate steps to improve Hamiltonian Monte Carlo
- An Efficient Coalescent Model for Heterochronously Sampled Molecular Data
- Precomputing strategy for Hamiltonian Monte Carlo method based on regularity in parameter space
- Geodesic Lagrangian Monte Carlo over the space of positive definite matrices: with application to Bayesian spectral density estimation
- An algorithm for distributed Bayesian inference
- Split Hamiltonian Monte Carlo revisited
- Distributed Bayesian Inference in Linear Mixed-Effects Models