Accelerated information gradient flow (Q2053344)

From MaRDI portal
scientific article
    Statements

    Accelerated information gradient flow (English)
    Publication date: 29 November 2021
    Nesterov's accelerated gradient method is a widely applied optimization method for classical problems in Euclidean space that accelerates gradient descent. The continuous-time limit of this method is known as the accelerated gradient flow. In this paper, the authors present a unified framework for accelerated gradient flows in probability spaces equipped with information metrics, named accelerated information gradient (AIG) flows. They formulate AIG flows under the Fisher-Rao metric, the Wasserstein metric, the Kalman-Wasserstein metric and the Stein metric, and prove convergence rates for the AIG flow. The authors present a discrete-time algorithm for the Wasserstein AIG flow, including a Brownian motion method and an adaptive restart technique. Numerical experiments, including a Bayesian logistic regression and a Bayesian neural network, show the strength of the proposed methods compared with state-of-the-art algorithms. In the supplementary materials, the authors provide discrete-time algorithms for both the Kalman-Wasserstein and Stein AIG flows.
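    The Euclidean ingredients the review mentions, Nesterov momentum and adaptive restart, can be sketched as follows. This is a minimal illustration of the classical scheme (gradient-based restart in the style of O'Donoghue and Candes), not the authors' Wasserstein AIG algorithm; the quadratic objective, step size, and iteration count are placeholder choices for the example.

    ```python
    import numpy as np

    def nesterov_restart(grad, x0, step, iters):
        """Nesterov accelerated gradient descent with gradient-based
        adaptive restart: momentum is reset whenever the momentum
        direction opposes the current gradient."""
        x = y = np.asarray(x0, dtype=float)
        t = 1.0  # momentum parameter
        for _ in range(iters):
            g = grad(y)
            x_new = y - step * g          # gradient step at the lookahead point
            t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
            momentum = x_new - x
            if np.dot(g, momentum) > 0:
                # restart: update direction opposes descent, drop momentum
                t_new = 1.0
                y = x_new
            else:
                y = x_new + ((t - 1.0) / t_new) * momentum
            x, t = x_new, t_new
        return x

    # usage: minimize the quadratic f(x) = 0.5 * ||A x||^2 (placeholder problem)
    A = np.diag([1.0, 3.0])
    grad = lambda x: A.T @ (A @ x)
    x_star = nesterov_restart(grad, x0=[5.0, 5.0], step=0.1, iters=300)
    ```

    Letting the step size go to zero in such a scheme yields the continuous-time accelerated gradient flow whose probability-space analogue the paper develops.
    
    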
    Nesterov's accelerated gradient method
    Bayesian inverse problem
    optimal transport
    information geometry
    0 references
    0 references
    0 references