Gradient method for optimization on Riemannian manifolds with lower bounded curvature


DOI: 10.1137/18M1180633
zbMATH Open: 1429.90051
arXiv: 1806.02694
OpenAlex: W2981099578
Wikidata: Q115246931 (Scholia: Q115246931)
MaRDI QID: Q5237307
FDO: Q5237307


Authors: Maurício Silva Louzeiro, L. F. Prudente, O. P. Ferreira


Publication date: 17 October 2019

Published in: SIAM Journal on Optimization

Abstract: The gradient method for minimizing a differentiable convex function on Riemannian manifolds with lower bounded sectional curvature is analyzed in this paper. The analysis is presented for three different finite procedures for determining the stepsize, namely, the Lipschitz stepsize, the adaptive stepsize, and the Armijo stepsize. The first procedure requires the objective function to have a Lipschitz continuous gradient, which is not necessary for the other two approaches. Convergence of the whole sequence to a minimizer, without any level set boundedness assumption, is proved. An iteration-complexity bound for functions with Lipschitz continuous gradient is also presented. Numerical experiments are provided to illustrate the effectiveness of the method in this new setting and to certify the obtained theoretical results. In particular, we consider the problem of finding the Riemannian center of mass and the so-called Karcher mean. Our numerical experiments indicate that the adaptive stepsize is a promising scheme that is worth considering.


Full work available at URL: https://arxiv.org/abs/1806.02694
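
As a concrete illustration of the method described in the abstract: the sketch below implements Riemannian gradient descent with an Armijo backtracking stepsize for the Karcher mean (Riemannian center of mass) problem on the unit sphere, one of the test problems the abstract mentions. This is a minimal sketch, not the authors' code; the function names, tolerances, and Armijo parameters sigma and beta are illustrative choices, while the exponential and logarithm maps are the standard closed-form ones for the sphere.

    import numpy as np

    def sphere_dist(x, y):
        # Geodesic (great-circle) distance between unit vectors x and y.
        return np.arccos(np.clip(x @ y, -1.0, 1.0))

    def exp_map(x, v):
        # Exponential map on the sphere: follow the geodesic from x with velocity v.
        nv = np.linalg.norm(v)
        if nv < 1e-12:
            return x
        return np.cos(nv) * x + (np.sin(nv) / nv) * v

    def log_map(x, y):
        # Logarithm map: tangent vector at x pointing toward y (y not antipodal to x).
        theta = sphere_dist(x, y)
        if theta < 1e-12:
            return np.zeros_like(x)
        return (theta / np.sin(theta)) * (y - np.cos(theta) * x)

    def karcher_cost(x, pts):
        # f(x) = (1/2m) * sum of squared geodesic distances to the data points.
        return 0.5 * np.mean([sphere_dist(x, p) ** 2 for p in pts])

    def karcher_grad(x, pts):
        # Riemannian gradient of f: minus the mean of the log maps.
        return -np.mean([log_map(x, p) for p in pts], axis=0)

    def riemannian_gd_armijo(pts, x0, sigma=1e-4, beta=0.5, tol=1e-8, max_iter=200):
        # Gradient descent along geodesics with Armijo backtracking line search.
        x = x0
        for _ in range(max_iter):
            g = karcher_grad(x, pts)
            gnorm2 = g @ g
            if gnorm2 < tol ** 2:
                break
            fx, t = karcher_cost(x, pts), 1.0
            # Armijo condition: sufficient decrease along the geodesic in direction -g.
            while karcher_cost(exp_map(x, -t * g), pts) > fx - sigma * t * gnorm2:
                t *= beta
            x = exp_map(x, -t * g)
        return x

    # Example: average 10 noisy points clustered near the north pole of S^2.
    rng = np.random.default_rng(0)
    pts = [p / np.linalg.norm(p) for p in rng.normal([0.0, 0.0, 3.0], 0.3, (10, 3))]
    mean = riemannian_gd_armijo(pts, pts[0])

Roughly speaking, the Lipschitz and adaptive stepsizes analyzed in the paper would replace the backtracking loop with a fixed step of size 1/L (requiring a known Lipschitz constant L for the gradient) or a step updated from local information, respectively; see the paper for the precise rules.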









This page was built for publication: Gradient method for optimization on Riemannian manifolds with lower bounded curvature
