Hybrid Riemannian conjugate gradient methods with global convergence properties
Publication: 2023690
DOI: 10.1007/s10589-020-00224-9
zbMath: 1466.90123
arXiv: 2002.01644
OpenAlex: W3083685597
Wikidata: Q115384041 (Scholia: Q115384041)
MaRDI QID: Q2023690
Hideaki Iiduka, Hiroyuki Sakai
Publication date: 3 May 2021
Published in: Computational Optimization and Applications
Full work available at URL: https://arxiv.org/abs/2002.01644
Keywords: global convergence; conjugate gradient method; Riemannian optimization; hybrid conjugate gradient method; strong Wolfe conditions
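The entry carries no abstract, but the keywords indicate the setting: conjugate gradient iterations on a Riemannian manifold with a hybrid choice of the conjugacy parameter and step sizes governed by the strong Wolfe conditions. The sketch below is illustrative only, not the authors' algorithm: it runs a generic hybrid Riemannian CG on the unit sphere (Rayleigh-quotient test problem), uses the projection retraction and projection vector transport, a classical hybrid parameter beta = max(0, min(beta_HS, beta_DY)), and a simple Armijo backtracking line search in place of a strong Wolfe search. The function and variable names are made up for the example.

```python
# Minimal sketch of a hybrid Riemannian conjugate gradient iteration on the
# unit sphere, minimizing f(x) = x^T A x (smallest-eigenvalue problem).
# Assumptions (not taken from the paper): projection retraction, projection
# vector transport, hybrid beta = max(0, min(beta_HS, beta_DY)), and an
# Armijo backtracking line search instead of the strong Wolfe conditions.
import numpy as np

def sphere_proj(x, v):
    """Project v onto the tangent space of the unit sphere at x."""
    return v - np.dot(x, v) * x

def retract(x, v):
    """Retraction: move from x along tangent vector v, renormalize to the sphere."""
    y = x + v
    return y / np.linalg.norm(y)

def hybrid_rcg(A, x0, max_iter=500, tol=1e-8):
    x = x0 / np.linalg.norm(x0)
    f = lambda z: z @ A @ z
    grad = lambda z: sphere_proj(z, 2 * A @ z)      # Riemannian gradient

    g = grad(x)
    d = -g                                          # initial search direction
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking (the paper instead assumes strong Wolfe steps).
        t, fx = 1.0, f(x)
        while t > 1e-12 and f(retract(x, t * d)) > fx + 1e-4 * t * np.dot(g, d):
            t *= 0.5
        x_new = retract(x, t * d)
        g_new = grad(x_new)

        # Transport previous quantities to the new tangent space by projection.
        d_t = sphere_proj(x_new, d)
        g_t = sphere_proj(x_new, g)
        y = g_new - g_t

        # Hybrid conjugacy parameter: Hestenes-Stiefel clipped by Dai-Yuan and at zero.
        denom = np.dot(d_t, y)
        if abs(denom) < 1e-16:
            beta = 0.0
        else:
            beta_hs = np.dot(g_new, y) / denom
            beta_dy = np.dot(g_new, g_new) / denom
            beta = max(0.0, min(beta_hs, beta_dy))

        d = -g_new + beta * d_t
        x, g = x_new, g_new
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    B = rng.standard_normal((20, 20))
    A = B + B.T                                     # symmetric test matrix
    x = hybrid_rcg(A, rng.standard_normal(20))
    print("smallest eigenvalue estimate:", x @ A @ x)
```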
Related Items (8)
- Riemannian Conjugate Gradient Methods: General Framework and Specific Algorithms with Convergence Analyses
- Memoryless quasi-Newton methods based on the spectral-scaling Broyden family for Riemannian optimization
- A hybrid Riemannian conjugate gradient method for nonconvex optimization problems
- Mini-workshop: Computational optimization on manifolds. Abstracts from the mini-workshop held November 15--21, 2020 (online meeting)
- Sufficient descent Riemannian conjugate gradient methods
- hybrid-rcg
- Global convergence of Riemannian line search methods with a Zhang-Hager-type condition
- Global convergence of Hager-Zhang type Riemannian conjugate gradient method
Uses Software
Cites Work
- A Dai-Yuan-type Riemannian conjugate gradient method with the weak Wolfe conditions
- Efficient hybrid conjugate gradient techniques
- Global convergence result for conjugate gradient methods
- Global optimization with orthogonality constraints via stochastic diffusion on manifold
- Pymanopt: A Python Toolbox for Optimization on Manifolds using Automatic Differentiation
- Low-Rank Matrix Completion by Riemannian Optimization
- Optimization Methods on Riemannian Manifolds and Their Application to Shape Space
- Descent Property and Global Convergence of the Fletcher-Reeves Method with Inexact Line Search
- Optimization Techniques on Riemannian Manifolds
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A new, globally convergent Riemannian conjugate gradient method
- Maxima for Graphs and a New Proof of a Theorem of Turán
- Function minimization by conjugate gradients
- Analysis Operator Learning and its Application to Image Reconstruction
- Convergence Conditions for Ascent Methods
- Convergence Conditions for Ascent Methods. II: Some Corrections
- Methods of conjugate gradients for solving linear systems
- Benchmarking optimization software with performance profiles.
- An efficient hybrid conjugate gradient method for unconstrained optimization