A hybrid Riemannian conjugate gradient method for nonconvex optimization problems
Publication: 2700130
DOI: 10.1007/s12190-022-01772-5
OpenAlex: W4285891457
Wikidata: Q115377078
Scholia: Q115377078
MaRDI QID: Q2700130
Authors: Xianglin Rong, Shajie Xing, Jin-Bao Jian, Chun-Ming Tang
Publication date: 20 April 2023
Published in: Journal of Applied Mathematics and Computing
Full work available at URL: https://doi.org/10.1007/s12190-022-01772-5
Keywords: global convergence, conjugate gradient method, retraction, Riemannian optimization, vector transport, Riemannian Wolfe conditions
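The keywords name the standard building blocks of a Riemannian conjugate gradient iteration: a retraction to move along the manifold, a vector transport to compare tangent vectors at different points, a rule for the CG parameter β (here a hybrid one), and Riemannian Wolfe conditions for the line search. The sketch below is only an illustrative toy implementation on the unit sphere, not the paper's algorithm: the hybrid choice β = max(0, min(β^HS, β^DY)), the Armijo backtracking used in place of a full Wolfe search, and all function names are assumptions made for this example.

```python
import numpy as np

# Illustrative Riemannian CG on the unit sphere S^{n-1}, minimizing the
# Rayleigh quotient f(x) = -x^T A x.  This is NOT the paper's method; the
# hybrid beta rule and the Armijo backtracking below are assumptions.

def riemannian_cg_sphere(A, x0, max_iter=500, tol=1e-8):
    def cost(x):
        return -(x @ A @ x)

    def grad(x):
        # Riemannian gradient = Euclidean gradient projected onto T_x S^{n-1}
        g = -2.0 * (A @ x)
        return g - (x @ g) * x

    def retract(x, v):
        # Retraction by normalization: R_x(v) = (x + v) / ||x + v||
        y = x + v
        return y / np.linalg.norm(y)

    def transport(x_new, v):
        # Vector transport by orthogonal projection onto T_{x_new} S^{n-1}
        return v - (x_new @ v) * x_new

    x = x0 / np.linalg.norm(x0)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking Armijo line search along the retracted curve
        # (the paper uses Riemannian Wolfe conditions instead).
        t, slope, fx = 1.0, g @ d, cost(x)
        while cost(retract(x, t * d)) > fx + 1e-4 * t * slope and t > 1e-12:
            t *= 0.5
        x_new = retract(x, t * d)
        g_new = grad(x_new)
        g_t = transport(x_new, g)                   # transported previous gradient
        d_t = transport(x_new, d)                   # transported previous direction
        y = g_new - g_t
        denom = d_t @ y
        if abs(denom) > 1e-12:
            beta_hs = (g_new @ y) / denom           # Hestenes-Stiefel
            beta_dy = (g_new @ g_new) / denom       # Dai-Yuan
            beta = max(0.0, min(beta_hs, beta_dy))  # generic hybrid rule (assumption)
        else:
            beta = 0.0                              # restart with steepest descent
        d = -g_new + beta * d_t
        if g_new @ d >= 0:                          # safeguard: keep d a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((20, 20))
    A = (A + A.T) / 2.0                             # symmetric test matrix
    x = riemannian_cg_sphere(A, rng.standard_normal(20))
    print("Rayleigh quotient:", x @ A @ x)          # typically close to the largest eigenvalue
```

Running the script drives x toward an eigenvector of A (typically a leading one from a random start), so the printed Rayleigh quotient approaches the largest eigenvalue; the normalization retraction and projection-based transport used here are the usual textbook choices for the sphere, while the cited Manopt and Pymanopt toolboxes provide production implementations of Riemannian CG.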
Uses Software
Cites Work
- Unnamed Item
- A new three-term conjugate gradient algorithm for unconstrained optimization
- A Dai-Yuan-type Riemannian conjugate gradient method with the weak Wolfe conditions
- An accelerated subspace minimization three-term conjugate gradient algorithm for unconstrained optimization
- New version of the three-term conjugate gradient method based on spectral scaling conjugacy condition that generates descent search direction
- The convergence properties of some new conjugate gradient methods
- Accelerated hybrid conjugate gradient algorithm with modified secant condition for unconstrained optimization
- A note about WYL's conjugate gradient method and its applications
- Global convergence result for conjugate gradient methods
- A Riemannian conjugate gradient method for optimization on the Stiefel manifold
- Riemannian conjugate gradient methods with inverse retraction
- Hybrid Riemannian conjugate gradient methods with global convergence properties
- A brief introduction to manifold optimization
- Cayley-transform-based gradient and conjugate gradient algorithms on Grassmann manifolds
- Global optimization with orthogonality constraints via stochastic diffusion on manifold
- A hybrid conjugate gradient method with descent property for unconstrained optimization
- Another hybrid conjugate gradient algorithm for unconstrained optimization
- Pymanopt: A Python Toolbox for Optimization on Manifolds using Automatic Differentiation
- Low-Rank Matrix Completion by Riemannian Optimization
- Optimization Methods on Riemannian Manifolds and Their Application to Shape Space
- Manopt, a Matlab toolbox for optimization on manifolds
- A Broyden Class of Quasi-Newton Methods for Riemannian Optimization
- A Three-Term Conjugate Gradient Method with Sufficient Descent Property for Unconstrained Optimization
- A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence
- Optimization Techniques on Riemannian Manifolds
- A Riemannian BFGS Method Without Differentiated Retraction for Nonconvex Optimization Problems
- Line Search Algorithms for Locally Lipschitz Functions on Riemannian Manifolds
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- Deterministic Guarantees for Burer‐Monteiro Factorizations of Smooth Semidefinite Programs
- Riemannian Optimization and Its Applications
- A new, globally convergent Riemannian conjugate gradient method
- A new spectral conjugate gradient method for large-scale unconstrained optimization
- Function minimization by conjugate gradients
- Some descent three-term conjugate gradient methods and their global convergence
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- Benchmarking optimization software with performance profiles
- An efficient hybrid conjugate gradient method for unconstrained optimization