Intrinsic convergence properties of entropic sampling algorithms
From MaRDI portal
Publication:3301994
DOI: 10.1088/1742-5468/2014/07/P07007
zbMATH Open: 1456.60005
arXiv: 1404.0725
MaRDI QID: Q3301994
Authors: R. E. Belardinelli, V. Pereyra, Bruno Jeferson Lourenço, Ronald Dickman
Publication date: 11 August 2020
Published in: Journal of Statistical Mechanics: Theory and Experiment
Abstract: We study the convergence of the density of states and thermodynamic properties in three flat-histogram simulation methods: the Wang-Landau (WL) algorithm, the \(1/t\) algorithm, and tomographic sampling (TS). In the first case the refinement parameter \(f\) is rescaled (\(f \to f/2\)) each time the flat-histogram condition is satisfied; in the second, \(f \sim 1/t\) after a suitable initial phase; in the third, \(f\) is constant (\(t\) denotes Monte Carlo time). To examine the intrinsic convergence properties of these methods, free of any complications associated with a specific model, we study a featureless entropy landscape, such that for each allowed energy \(E = 1, \dots, L\) there is exactly one state, that is, \(g(E) = 1\) for all \(E\). Convergence of sampling corresponds to \(g(E,t) \to \text{const.}\) as \(t \to \infty\), so that the standard deviation \(\sigma_g\) of \(g\) over energy values is a measure of the overall sampling error. Neither the WL algorithm nor TS converges: in both cases \(\sigma_g\) saturates at long times. In the \(1/t\) algorithm, by contrast, \(\sigma_g\) decays \(\propto 1/\sqrt{t}\). Modified TS and \(1/t\) procedures, in which \(f \propto 1/t^\alpha\), converge for \(\alpha\) values between 0 and 1. There are two essential facets to convergence of flat-histogram methods: elimination of initial errors in \(g(E)\), and correction of the sampling noise accumulated during the process. For a simple example, we demonstrate analytically, using a Langevin equation, that both kinds of errors can be eliminated asymptotically if \(f \sim 1/t^\alpha\) with \(0 < \alpha \leq 1\). Convergence is optimal for \(\alpha = 1\). For \(\alpha \leq 0\) the sampling noise never decays, while for \(\alpha > 1\) the initial error is never completely eliminated.
Full work available at URL: https://arxiv.org/abs/1404.0725
Mathematics Subject Classification
- Computational methods for problems pertaining to probability theory (60-08)
- Monte Carlo methods (65C05)
Cites Work
- Numerical integration using Wang-Landau sampling
- Convergence and refinement of the Wang-Landau algorithm
- Determining the density of states for classical statistical models by a flat-histogram random walk
- Performance of Wang-Landau algorithm in continuous spin models and a case study: Modified XY-model
- Monte Carlo study of the antiferromagnetic three-state Potts model with a staggered polarization field on the square lattice
Cited In (8)
- Static critical behavior of the \(q\)-states Potts model: high-resolution entropic study
- Phase diagram and critical behavior of the antiferromagnetic Ising model in an external field
- Convergence of the equi-energy sampler
- Sampling, Metric Entropy, and Dimensionality Reduction
- Determining the density of states for classical statistical models by a flat-histogram random walk
- Convergence and refinement of the Wang-Landau algorithm
- Continuous relaxations for Constrained Maximum-Entropy Sampling
- A theory on flat histogram Monte Carlo algorithms