Gradient method for optimization on Riemannian manifolds with lower bounded curvature
From MaRDI portal
Publication:5237307
Abstract: The gradient method for minimizing a differentiable convex function on Riemannian manifolds with lower bounded sectional curvature is analyzed in this paper. The analysis covers three different finite procedures for determining the stepsize, namely, the Lipschitz stepsize, the adaptive stepsize, and Armijo's stepsize. The first procedure requires the objective function to have a Lipschitz continuous gradient, which is not necessary for the other two. Convergence of the whole sequence to a minimizer is proved without any level-set boundedness assumption. An iteration-complexity bound for functions with Lipschitz continuous gradient is also presented. Numerical experiments illustrate the effectiveness of the method in this new setting and certify the theoretical results. In particular, we consider the problem of finding the Riemannian center of mass, the so-called Karcher mean. Our numerical experiments indicate that the adaptive stepsize is a promising scheme that is worth considering.
Recommendations
- Iteration-complexity of the subgradient method on Riemannian manifolds with lower bounded curvature
- Subgradient algorithms on Riemannian manifolds of lower bounded curvatures
- Convergence Analysis of Gradient Algorithms on Riemannian Manifolds without Curvature Constraints and Application to Riemannian Mass
- First order methods for optimization on Riemannian manifolds
- Iteration-complexity of gradient, subgradient and proximal point methods on Riemannian manifolds
Cites work
- scientific article; zbMATH DE number 52737
- scientific article; zbMATH DE number 1246686
- scientific article; zbMATH DE number 1282147
- scientific article; zbMATH DE number 681023
- scientific article; zbMATH DE number 1104271
- scientific article; zbMATH DE number 1104279
- scientific article; zbMATH DE number 909255
- scientific article; zbMATH DE number 5223994
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- A framework for generalising the Newton method and other iterative methods from Euclidean space to manifolds
- A survey and comparison of contemporary algorithms for computing the matrix geometric mean
- Benchmarking optimization software with performance profiles
- Computing the Karcher mean of symmetric positive definite matrices
- Concepts and techniques of optimization on the sphere
- Conic geometric optimization on the manifold of positive definite matrices
- Contributions to the study of monotone vector fields
- Domains of positivity
- Full convergence of the steepest descent method with inexact line searches
- Geometric means
- Global rates of convergence for nonconvex optimization on manifolds
- Gradient methods for minimizing composite functions
- Introductory lectures on convex optimization: a basic course
- Iteration-complexity of gradient, subgradient and proximal point methods on Riemannian manifolds
- Lattices in spaces of nonpositive curvature
- Linear convergence of subgradient algorithm for convex feasibility on Riemannian manifolds
- Minimizing a differentiable function over a differential manifold
- Non-existence of continuous convex functions on certain Riemannian manifolds
- On the Riemannian geometry defined by self-concordant barriers and interior-point methods
- On the convergence of gradient descent for finding the Riemannian center of mass
- Optimal placement of a deposit between markets: Riemann-Finsler geometrical approach
- Optimization Techniques on Riemannian Manifolds
- Smooth nonlinear optimization of \(\mathbb R^n\)
- Statistics on the manifold of multivariate normal distributions: theory and application to diffusion tensor MRI processing
- Step-sizes for the gradient method
- Subgradient projection algorithms for convex feasibility on Riemannian manifolds with lower bounded curvatures
- The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
- The Geometry of Algorithms with Orthogonality Constraints
- The Gradient Projection Method Along Geodesics
- Variational inequalities for set-valued vector fields on Riemannian manifolds: convexity of the solution set and the proximal point algorithm
- Weak sharp minima on Riemannian manifolds
Cited in (34)
- Dynamical systems for solving variational inclusion and fixed point problems on Hadamard manifolds
- Constraint qualifications and optimality criteria for nonsmooth multiobjective programming problems on Hadamard manifolds
- On the relationship between the Kurdyka-Łojasiewicz property and error bounds on Hadamard manifolds
- Iteration-complexity of gradient, subgradient and proximal point methods on Riemannian manifolds
- Computing Riemannian center of mass on Hadamard manifolds
- Iterative algorithms for monotone variational inequality and fixed point problems on Hadamard manifolds
- First order methods for optimization on Riemannian manifolds
- scientific article; zbMATH DE number 6177815
- Nonlinear evolution equations via resolvent operators on Hadamard manifolds
- On the convergence of gradient descent for finding the Riemannian center of mass
- scientific article; zbMATH DE number 6135092
- Variational inequalities governed by strongly pseudomonotone vector fields on Hadamard manifolds
- Numerical approaches for constrained and unconstrained, static optimization on the special Euclidean group \(\mathsf{SE}(3)\)
- Proximal point method for quasiconvex functions in Riemannian manifolds
- Proximal gradient method for nonconvex and nonsmooth optimization on Hadamard manifolds
- Iteration-complexity of the subgradient method on Riemannian manifolds with lower bounded curvature
- Fenchel conjugate via Busemann function on Hadamard manifolds
- Constraint qualifications for nonsmooth multiobjective programming problems with switching constraints on Hadamard manifolds
- Path-based incremental target level algorithm on Riemannian manifolds
- A trust region method for solving multicriteria optimization problems on Riemannian manifolds
- Riemannian stochastic variance-reduced cubic regularized Newton method for submanifold optimization
- Operator-valued formulas for Riemannian gradient and Hessian and families of tractable metrics in Riemannian optimization
- Optimality conditions and duality for multiobjective semi-infinite optimization problems with switching constraints on Hadamard manifolds
- A modified proximal point method for DC functions on Hadamard manifolds
- Iterative algorithm for singularities of inclusion problems in Hadamard manifolds
- Convexity of sets and quadratic functions on the hyperbolic space
- Fenchel duality theory and a primal-dual algorithm on Riemannian manifolds
- Nonsmooth nonconvex optimization on Riemannian manifolds via bundle trust region algorithm
- Constraint qualifications and optimality conditions for nonsmooth multiobjective mathematical programming problems with vanishing constraints on Hadamard manifolds via convexificators
- Well-posedness of an interaction model on Riemannian manifolds
- A strongly convergent proximal point method for vector optimization
- Iteration-complexity and asymptotic analysis of steepest descent method for multiobjective optimization on Riemannian manifolds
- Convergence Analysis of Gradient Algorithms on Riemannian Manifolds without Curvature Constraints and Application to Riemannian Mass
- An adaptive Riemannian gradient method without function evaluations
This page was built for publication: Gradient method for optimization on Riemannian manifolds with lower bounded curvature