Is there an analog of Nesterov acceleration for gradient-based MCMC? (Q2040101)

scientific article

    Statements

    Is there an analog of Nesterov acceleration for gradient-based MCMC? (English)
    Publication date: 9 July 2021
    In continuous optimization, Nesterov's accelerated gradient descent improves on the convergence rate of plain gradient descent. This paper asks whether an analogous acceleration exists for Markov chain Monte Carlo (MCMC) methods, for which slow convergence is a central concern. As in accelerated gradient descent for optimization, the authors introduce a momentum variable and derive a stochastic differential equation for the accelerated dynamics, viewed as optimization in the space of probability distributions with the Kullback-Leibler divergence to the target distribution as the objective functional. By analyzing the convergence rate of the continuous accelerated dynamics together with the discretization error, they show that the resulting accelerated sampling algorithm improves the dimension and accuracy dependence of the unadjusted Langevin algorithm, which is the sampling analogue of unaccelerated gradient descent. The paper thus provides a solid theoretical foundation for practical accelerated MCMC methods.
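    To make the comparison concrete, here is a minimal sketch, not the authors' exact scheme: the unadjusted Langevin algorithm (ULA) next to a momentum-augmented (underdamped) Langevin sampler, run on a standard Gaussian target. The target, step size, friction parameter gamma, and iteration counts are illustrative assumptions, not values from the paper.

```python
# Sketch: ULA vs. a momentum-augmented (underdamped) Langevin sampler.
# The standard Gaussian target and all tuning constants below are
# illustrative assumptions, not taken from the paper under review.
import numpy as np

rng = np.random.default_rng(0)

def grad_log_target(x):
    # log pi(x) = -||x||^2 / 2 (standard Gaussian), so grad log pi(x) = -x
    return -x

def ula(x0, step, n_iters):
    # Unadjusted Langevin algorithm: Euler-Maruyama discretization of
    #   dX_t = grad log pi(X_t) dt + sqrt(2) dB_t
    x = x0.copy()
    for _ in range(n_iters):
        x += step * grad_log_target(x) \
             + np.sqrt(2.0 * step) * rng.standard_normal(x.shape)
    return x

def underdamped_langevin(x0, step, gamma, n_iters):
    # Momentum-augmented (underdamped) Langevin dynamics:
    #   dX_t = V_t dt
    #   dV_t = (grad log pi(X_t) - gamma * V_t) dt + sqrt(2 * gamma) dB_t
    # discretized here with a simple Euler scheme for illustration.
    x, v = x0.copy(), np.zeros_like(x0)
    for _ in range(n_iters):
        v += step * (grad_log_target(x) - gamma * v) \
             + np.sqrt(2.0 * gamma * step) * rng.standard_normal(v.shape)
        x += step * v
    return x

d = 10                       # dimension
x0 = 5.0 * np.ones(d)        # start far from the target mode
print(ula(x0, step=0.05, n_iters=2000)[:3])
print(underdamped_langevin(x0, step=0.05, gamma=2.0, n_iters=2000)[:3])
```

    Both chains drift toward the high-probability region of the target; the momentum variable in the second sampler plays the role that Nesterov's momentum plays in optimization, which is the analogy the paper makes rigorous.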
    Keywords:
    accelerated gradient descent
    Langevin Monte Carlo
    Markov chain Monte Carlo (MCMC)
    sampling algorithms