A variational formulation of accelerated optimization on Riemannian manifolds
From MaRDI portal
Publication:5863993
Abstract: It was shown recently by Su et al. (2016) that Nesterov's accelerated gradient method for minimizing a smooth convex function \(f\) can be thought of as the time discretization of a second-order ODE, and that \(f\) converges to its optimal value at a rate of \(\mathcal{O}(1/t^2)\) along any trajectory of this ODE. A variational formulation was introduced in Wibisono et al. (2016) which allowed for accelerated convergence at a rate of \(\mathcal{O}(1/t^p)\), for arbitrary \(p > 0\), in normed vector spaces. This framework was exploited in Duruisseaux et al. (2021) to design efficient explicit algorithms for symplectic accelerated optimization. In Alimisis et al. (2020), a second-order ODE was proposed as the continuous-time limit of a Riemannian accelerated algorithm, and it was shown that the objective function converges to its optimal value at a rate of \(\mathcal{O}(1/t^2)\) along solutions of this ODE. In this paper, we show that on Riemannian manifolds, the convergence rate of \(f\) to its optimal value can also be accelerated to an arbitrary rate \(\mathcal{O}(1/t^p)\), by considering a family of time-dependent Bregman Lagrangian and Hamiltonian systems on Riemannian manifolds. This generalizes the results of Wibisono et al. (2016) to Riemannian manifolds and also provides a variational framework for accelerated optimization on Riemannian manifolds. An approach based on the time-invariance property of the family of Bregman Lagrangians and Hamiltonians was used to construct very efficient optimization algorithms in Duruisseaux et al. (2021), and we establish a similar time-invariance property in the Riemannian setting. One expects that a geometric numerical integrator that is time-adaptive, symplectic, and Riemannian manifold preserving will yield a class of promising optimization algorithms on manifolds.
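For context, the discrete iteration behind the ODE viewpoint of Su et al. (2016) is Nesterov's accelerated gradient method, whose function values decay at the \(\mathcal{O}(1/k^2)\) rate mentioned above. The following minimal NumPy sketch illustrates this on a convex quadratic; the objective, step size, and momentum schedule \((k-1)/(k+2)\) are standard illustrative choices, not taken from the paper itself:

```python
import numpy as np

def nesterov(grad, x0, step, iters):
    """Nesterov's accelerated gradient method:
        x_k = y_{k-1} - step * grad(y_{k-1})
        y_k = x_k + (k - 1)/(k + 2) * (x_k - x_{k-1})
    In the continuous-time limit this recovers the second-order ODE
    of Su et al. (2016)."""
    x_prev = np.asarray(x0, dtype=float)
    y = x_prev.copy()
    trajectory = [x_prev.copy()]
    for k in range(1, iters + 1):
        x = y - step * grad(y)
        y = x + (k - 1) / (k + 2) * (x - x_prev)
        x_prev = x
        trajectory.append(x.copy())
    return trajectory

# Illustrative smooth convex objective f(x) = 0.5 x^T A x, minimum value 0.
A = np.diag([1.0, 10.0])                # Lipschitz constant L = 10
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x

xs = nesterov(grad, x0=[3.0, -2.0], step=0.1, iters=200)  # step <= 1/L
gaps = [f(x) for x in xs]               # optimality gaps f(x_k) - f*
```

The standard convergence guarantee, \(f(x_k) - f^* \le 2\|x_0 - x^*\|^2 / (s(k+1)^2)\), bounds the final gap here by roughly \(6.5 \times 10^{-3}\).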
Recommendations
- Accelerated optimization on Riemannian manifolds via discrete constrained variational integrators
- A variational perspective on accelerated methods in optimization
- Accelerated Optimization in the PDE Framework: Formulations for the Manifold of Diffeomorphisms
- Riemannian proximal gradient methods
- scientific article; zbMATH DE number 1146390
Cites work
- scientific article; zbMATH DE number 3850830
- scientific article; zbMATH DE number 3790208
- scientific article; zbMATH DE number 3479763
- scientific article; zbMATH DE number 1246686
- scientific article; zbMATH DE number 1300852
- scientific article; zbMATH DE number 5223994
- A differential equation for modeling Nesterov's accelerated gradient method: theory and insights
- A variational perspective on accelerated methods in optimization
- Accelerated optimization on Riemannian manifolds via discrete constrained variational integrators
- Accelerating the cubic regularization of Newton's method on convex problems
- Adaptive Hamiltonian variational integrators and applications to symplectic accelerated optimization
- Discrete mechanics and variational integrators
- Geometric Numerical Integration
- Introduction to Riemannian Manifolds
- Introductory lectures on convex optimization. A basic course.
- Riemannian geometry and geometric analysis
- Variational and geometric structures of discrete Dirac mechanics
Cited in (9)
- High-order symplectic Lie group methods on \(SO(n)\) using the polar decomposition
- Time-adaptive Lagrangian variational integrators for accelerated optimization
- Practical perspectives on symplectic accelerated optimization
- Optimization via conformal Hamiltonian systems on manifolds
- Accelerated optimization on Riemannian manifolds via discrete constrained variational integrators
- Accelerated Optimization in the PDE Framework: Formulations for the Manifold of Diffeomorphisms
- Fast gradient method for low-rank matrix estimation
- A variational perspective on accelerated methods in optimization
- Bregman dynamics, contact transformations and convex optimization