Optimal tuning of the hybrid Monte Carlo algorithm


Publication:2435211

DOI: 10.3150/12-BEJ414
zbMath: 1287.60090
arXiv: 1001.4460
OpenAlex: W1981514681
MaRDI QID: Q2435211

Alexandros Beskos, Andrew M. Stuart, Natesh S. Pillai, Gareth O. Roberts, Jesús María Sanz-Serna

Publication date: 4 February 2014

Published in: Bernoulli

Full work available at URL: https://arxiv.org/abs/1001.4460
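The paper analyzes how the leapfrog step size of hybrid (Hamiltonian) Monte Carlo should scale with dimension, showing that for i.i.d. product targets the step size should be of order d^{-1/4} and that the asymptotically optimal average acceptance probability is 0.651. A minimal sketch of the algorithm being tuned, on a product-Gaussian target (the step-size constant 1.2, the path length L=10, and the function names here are illustrative choices, not taken from the paper):

```python
import numpy as np

def leapfrog(q, p, grad_U, eps, L):
    """Leapfrog integration of Hamiltonian dynamics: L steps of size eps."""
    q, p = q.copy(), p.copy()
    p -= 0.5 * eps * grad_U(q)        # initial half step for momentum
    for _ in range(L - 1):
        q += eps * p                  # full step for position
        p -= eps * grad_U(q)          # full step for momentum
    q += eps * p
    p -= 0.5 * eps * grad_U(q)        # final half step for momentum
    return q, p

def hmc(U, grad_U, q0, eps, L, n_iter, rng):
    """Hybrid/Hamiltonian Monte Carlo with a Metropolis accept/reject step."""
    q = q0.copy()
    samples, accepts = [], 0
    for _ in range(n_iter):
        p = rng.standard_normal(q.shape)          # resample momentum
        q_new, p_new = leapfrog(q, p, grad_U, eps, L)
        # H(q, p) = U(q) + |p|^2 / 2; accept with probability exp(-dH)
        dH = U(q_new) + 0.5 * p_new @ p_new - (U(q) + 0.5 * p @ p)
        if rng.random() < np.exp(-dH):
            q, accepts = q_new, accepts + 1
        samples.append(q)
    return np.array(samples), accepts / n_iter

# Product of d standard normals: U(q) = |q|^2 / 2, the i.i.d. setting
# analyzed in the paper.
d = 100
U = lambda q: 0.5 * (q @ q)
grad_U = lambda q: q
rng = np.random.default_rng(0)
eps = 1.2 * d ** -0.25    # step size of order d^{-1/4}, the paper's scaling
samples, acc_rate = hmc(U, grad_U, np.zeros(d), eps, L=10,
                        n_iter=2000, rng=rng)
print(f"acceptance rate: {acc_rate:.3f}")
```

Under the paper's scaling, tuning the constant in front of d^{-1/4} so that the observed acceptance rate is near 0.651 is, asymptotically, the optimal choice; the constant 1.2 above is only a plausible starting point for experimentation.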




Related Items (68)

HMC: reducing the number of rejections by not using leapfrog and some results on the acceptance rate
Optimal scaling of random walk Metropolis algorithms using Bayesian large-sample asymptotics
Merging MCMC subposteriors through Gaussian-process approximations
Designing simple and efficient Markov chain Monte Carlo proposal kernels
An adaptive multiple-try Metropolis algorithm
Unnamed Item
Forward Event-Chain Monte Carlo: Fast Sampling by Randomness Control in Irreversible Markov Chains
FEM-based discretization-invariant MCMC methods for PDE-constrained Bayesian inverse problems
Multiple-time-stepping generalized hybrid Monte Carlo methods
Hierarchical models: local proposal variances for RWM-within-Gibbs and MALA-within-Gibbs
On the application of improved symplectic integrators in Hamiltonian Monte Carlo
Unnamed Item
Adaptive multi-stage integrators for optimal energy conservation in molecular simulations
Copula multivariate GARCH model with constrained Hamiltonian Monte Carlo
Rejoinder on: ``Some recent work on multivariate Gaussian Markov random fields
Thread-Parallel Anisotropic Mesh Adaptation
Optimal scaling and diffusion limits for the Langevin algorithm in high dimensions
Finite element model updating using Hamiltonian Monte Carlo techniques
Bayesian elastic net based on empirical likelihood
Adaptive multiple importance sampling for Gaussian processes
Bayesian computation for Log-Gaussian Cox processes: a comparative analysis of methods
Palindromic 3-stage splitting integrators, a roadmap
Non-reversible guided Metropolis kernel
System identification using autoregressive Bayesian neural networks with nonparametric noise models
Constrained Hamiltonian Monte Carlo in BEKK GARCH with targeting
Optimal scaling of random-walk Metropolis algorithms on general target distributions
Convergence of unadjusted Hamiltonian Monte Carlo for mean-field models
Concave-Convex PDMP-based Sampling
The Apogee to Apogee Path Sampler
Adaptive tuning of Hamiltonian Monte Carlo within sequential Monte Carlo
Modified Cholesky Riemann manifold Hamiltonian Monte Carlo: exploiting sparsity for fast sampling of high-dimensional targets
Markov chain Monte Carlo algorithms with sequential proposals
Localization for MCMC: sampling high-dimensional posterior distributions with local structure
Optimal scaling for the transient phase of Metropolis Hastings algorithms: the longtime behavior
Leveraging Bayesian analysis to improve accuracy of approximate models
Simulating Coulomb and log-gases with hybrid Monte Carlo algorithms
MALA-within-Gibbs Samplers for High-Dimensional Distributions with Sparse Conditional Structure
Mixing of Hamiltonian Monte Carlo on strongly log-concave distributions: continuous dynamics
Symmetrically Processed Splitting Integrators for Enhanced Hamiltonian Monte Carlo Sampling
Maximum Conditional Entropy Hamiltonian Monte Carlo Sampler
Rejoinder: Geodesic Monte Carlo on Embedded Manifolds
Weak convergence and optimal tuning of the reversible jump algorithm
Parameter estimation in stochastic differential equations with Markov chain Monte Carlo and non-linear Kalman filtering
Hierarchical models and tuning of random walk Metropolis algorithms
Adaptive Thermostats for Noisy Gradient Systems
Leave Pima Indians alone: binary regression as a benchmark for Bayesian computation
Extra chance generalized hybrid Monte Carlo
On the stability of sequential Monte Carlo methods in high dimensions
Asymptotic variance for random walk Metropolis chains in high dimensions: logarithmic growth via the Poisson equation
Bayesian computation: a summary of the current state, and samples backwards and forwards
MCMC methods for functions: modifying old algorithms to make them faster
Random walk Metropolis algorithm in high dimension with non-Gaussian target distributions
Two-scale coupling for preconditioned Hamiltonian Monte Carlo in infinite dimensions
Modified Hamiltonian Monte Carlo for Bayesian inference
Recycling intermediate steps to improve Hamiltonian Monte Carlo
Adaptive Step Size Selection for Hessian-Based Manifold Langevin Samplers
Noisy Hamiltonian Monte Carlo for Doubly Intractable Distributions
Geometric integrators and the Hamiltonian Monte Carlo method
Accelerated Dimension-Independent Adaptive Metropolis
Randomized Hamiltonian Monte Carlo as scaling limit of the bouncy particle sampler and dimension-free convergence rates
Spatial voting models in circular spaces: a case study of the U.S. House of Representatives
On the geometric ergodicity of Hamiltonian Monte Carlo
Adaptive random neighbourhood informed Markov chain Monte Carlo for high-dimensional Bayesian variable selection
Semi-supervised nonparametric Bayesian modelling of spatial proteomics
Split Hamiltonian Monte Carlo revisited
Mixing rates for Hamiltonian Monte Carlo algorithms in finite and infinite dimensions
Skew Brownian motion and complexity of the ALPS algorithm
Optimal scaling for the transient phase of the random walk Metropolis algorithm: the mean-field limit



Cites Work


This page was built for publication: Optimal tuning of the hybrid Monte Carlo algorithm