Sufficient descent Riemannian conjugate gradient methods
From MaRDI portal
Publication: 2046556
DOI: 10.1007/s10957-021-01874-3
zbMath: 1472.65072
arXiv: 2009.01451
OpenAlex: W3166589644
Wikidata: Q115382534 (Scholia: Q115382534)
MaRDI QID: Q2046556
Hideaki Iiduka, Hiroyuki Sakai
Publication date: 18 August 2021
Published in: Journal of Optimization Theory and Applications
Full work available at URL: https://arxiv.org/abs/2009.01451
Keywords: sufficient descent condition; line search algorithm; strong Wolfe conditions; Riemannian conjugate gradient method
MSC: Numerical mathematical programming methods (65K05); Nonconvex programming, global optimization (90C26); Differentiable mappings in differential topology (57R35)
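The keywords above (sufficient descent, strong Wolfe line search, Riemannian conjugate gradient) can be illustrated with a minimal sketch. The following is a generic Fletcher-Reeves Riemannian CG on the unit sphere for the Rayleigh quotient \(f(x) = x^T A x\), not the authors' specific algorithm: tangent-space projection stands in for vector transport, normalization serves as the retraction, and a backtracking Armijo rule replaces the strong Wolfe line search analyzed in the paper. All function names are illustrative.

```python
import math

def rcg_sphere(A, x0, iters=200):
    """Generic Riemannian CG sketch on the unit sphere for f(x) = x^T A x.

    Projection onto the tangent space converts the Euclidean gradient to the
    Riemannian one and transports the search direction; normalization is the
    retraction; a backtracking Armijo rule is used instead of strong Wolfe.
    """
    def dot(u, v): return sum(a * b for a, b in zip(u, v))
    def matvec(M, v): return [dot(row, v) for row in M]
    def normalize(v):
        s = math.sqrt(dot(v, v))
        return [a / s for a in v]
    def proj(x, v):  # orthogonal projection of v onto the tangent space at x
        c = dot(x, v)
        return [a - c * b for a, b in zip(v, x)]
    def f(x): return dot(x, matvec(A, x))
    def grad(x):  # Riemannian gradient: project the Euclidean gradient 2Ax
        return proj(x, [2.0 * a for a in matvec(A, x)])

    x = normalize(x0)
    g = grad(x)
    d = [-a for a in g]  # initial direction: steepest descent
    for _ in range(iters):
        gg = dot(g, g)
        if gg < 1e-18:  # first-order stationary point reached
            break
        # Backtracking Armijo line search along the retracted curve
        t, fx, slope = 1.0, f(x), dot(g, d)
        while f(normalize([a + t * b for a, b in zip(x, d)])) > fx + 1e-4 * t * slope:
            t *= 0.5
            if t < 1e-12:
                break
        x_new = normalize([a + t * b for a, b in zip(x, d)])
        g_new = grad(x_new)
        beta = dot(g_new, g_new) / gg  # Fletcher-Reeves coefficient
        d = [-a + beta * b for a, b in zip(g_new, proj(x_new, d))]
        if dot(g_new, d) >= 0.0:  # safeguard: restart with steepest descent
            d = [-a for a in g_new]
        x, g = x_new, g_new
    return x, f(x)
```

For example, with A = diag(3, 2, 1) the iterates approach the eigenvector of the smallest eigenvalue, so the final cost approaches 1. The sufficient-descent safeguard (restarting when the new direction is not a descent direction) is a crude stand-in for the conditions the paper establishes analytically.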
Related Items
- Riemannian Conjugate Gradient Methods: General Framework and Specific Algorithms with Convergence Analyses
- Riemannian optimization on unit sphere with \(p\)-norm and its applications
- Faster Riemannian Newton-type optimization by subsampling and cubic regularization
- A three-term conjugate gradient method with a random parameter for large-scale unconstrained optimization and its application in regression model
- Memoryless quasi-Newton methods based on the spectral-scaling Broyden family for Riemannian optimization
- Global convergence of Hager-Zhang type Riemannian conjugate gradient method
Uses Software
Cites Work
- A Dai-Yuan-type Riemannian conjugate gradient method with the weak Wolfe conditions
- Global convergence result for conjugate gradient methods
- A Riemannian conjugate gradient method for optimization on the Stiefel manifold
- Hybrid Riemannian conjugate gradient methods with global convergence properties
- Pymanopt: A Python Toolbox for Optimization on Manifolds using Automatic Differentiation
- Low-Rank Matrix Completion by Riemannian Optimization
- Optimization Methods on Riemannian Manifolds and Their Application to Shape Space
- Descent Property and Global Convergence of the Fletcher-Reeves Method with Inexact Line Search
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Optimization Techniques on Riemannian Manifolds
- Line Search Algorithms for Locally Lipschitz Functions on Riemannian Manifolds
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A new, globally convergent Riemannian conjugate gradient method
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- Convergence Conditions for Ascent Methods
- Convergence Conditions for Ascent Methods. II: Some Corrections
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- Benchmarking optimization software with performance profiles
- An efficient hybrid conjugate gradient method for unconstrained optimization