A Dai-Yuan-type Riemannian conjugate gradient method with the weak Wolfe conditions
DOI: 10.1007/s10589-015-9801-1 · zbMath: 1338.65164 · arXiv: 1405.4371 · OpenAlex: W2225449929 · Wikidata: Q115384047 (Scholia: Q115384047) · MaRDI QID: Q276855
Publication date: 4 May 2016
Published in: Computational Optimization and Applications
Full work available at URL: https://arxiv.org/abs/1405.4371
Keywords: global convergence; conjugate gradient method; Riemannian optimization; scaled vector transport; weak Wolfe conditions
MSC classification: Numerical mathematical programming methods (65K05); Nonlinear programming (90C30); Numerical methods based on nonlinear programming (49M37)
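For orientation, the following is a minimal sketch of the Dai-Yuan coefficient and the weak Wolfe line-search conditions referred to in the title, in standard Euclidean form, followed by an illustrative Riemannian analogue. The notation (retraction \(R\), vector transport \(\mathcal{T}\), direction \(\eta_k\), constants \(c_1, c_2\)) is generic textbook notation, not a reproduction of the paper's scaled vector transport construction.

\[
\beta_k^{\mathrm{DY}} = \frac{\|g_k\|^2}{d_{k-1}^{\top}\,(g_k - g_{k-1})},
\qquad
d_k = -g_k + \beta_k^{\mathrm{DY}}\, d_{k-1},
\]
where \(g_k = \nabla f(x_k)\), and the step size \(\alpha_k\) is required to satisfy the weak Wolfe conditions with \(0 < c_1 < c_2 < 1\):
\[
f(x_k + \alpha_k d_k) \le f(x_k) + c_1 \alpha_k\, g_k^{\top} d_k,
\qquad
\nabla f(x_k + \alpha_k d_k)^{\top} d_k \ge c_2\, g_k^{\top} d_k .
\]
In a Riemannian version (illustrative notation only), the update moves along a retraction, \(x_{k+1} = R_{x_k}(\alpha_k \eta_k)\), and the previous direction is carried to the new tangent space by a vector transport before forming the Dai-Yuan-type ratio:
\[
\eta_{k+1} = -\operatorname{grad} f(x_{k+1}) + \beta_{k+1}\, \mathcal{T}^{(k)}(\eta_k),
\qquad
\beta_{k+1} = \frac{\langle \operatorname{grad} f(x_{k+1}),\, \operatorname{grad} f(x_{k+1})\rangle_{x_{k+1}}}
{\langle \operatorname{grad} f(x_{k+1}),\, \mathcal{T}^{(k)}(\eta_k)\rangle_{x_{k+1}} - \langle \operatorname{grad} f(x_k),\, \eta_k\rangle_{x_k}} .
\]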
Related Items (30)
- Riemannian Conjugate Gradient Methods: General Framework and Specific Algorithms with Convergence Analyses
- HSH-norm optimal MOR for the MIMO linear time-invariant systems on the Stiefel manifold
- Riemannian stochastic fixed point optimization algorithm
- Riemannian conjugate gradient methods for computing the extreme eigenvalues of symmetric tensors
- Optimizing Oblique Projections for Nonlinear Systems using Trajectories
- A limited-memory Riemannian symmetric rank-one trust-region method with a restart strategy
- Cholesky QR-based retraction on the generalized Stiefel manifold
- Adaptive trust-region method on Riemannian manifold
- Riemannian conjugate gradient method for low-rank tensor completion
- Riemannian optimization on unit sphere with \(p\)-norm and its applications
- A communication-efficient and privacy-aware distributed algorithm for sparse PCA
- Solving PhaseLift by Low-Rank Riemannian Optimization Methods for Complex Semidefinite Constraints
- Memoryless quasi-Newton methods based on the spectral-scaling Broyden family for Riemannian optimization
- A hybrid Riemannian conjugate gradient method for nonconvex optimization problems
- Cayley-transform-based gradient and conjugate gradient algorithms on Grassmann manifolds
- First Order Methods for Optimization on Riemannian Manifolds
- Mini-workshop: Computational optimization on manifolds. Abstracts from the mini-workshop held November 15--21, 2020 (online meeting)
- Intrinsic representation of tangent vectors and vector transports on matrix manifolds
- A Geometric Nonlinear Conjugate Gradient Method for Stochastic Inverse Eigenvalue Problems
- A Riemannian conjugate gradient method for optimization on the Stiefel manifold
- Riemannian conjugate gradient methods with inverse retraction
- Hybrid Riemannian conjugate gradient methods with global convergence properties
- A Riemannian Fletcher--Reeves Conjugate Gradient Method for Doubly Stochastic Inverse Eigenvalue Problems
- Sufficient descent Riemannian conjugate gradient methods
- On matrix exponentials and their approximations related to optimization on the Stiefel manifold
- Blind Deconvolution by a Steepest Descent Algorithm on a Quotient Manifold
- Global convergence of Riemannian line search methods with a Zhang-Hager-type condition
- A Riemannian gradient ascent algorithm with applications to orthogonal approximation problems of symmetric tensors
- H2 optimal model order reduction on the Stiefel manifold for the MIMO discrete system by the cross Gramian
- Global convergence of Hager-Zhang type Riemannian conjugate gradient method
Cites Work
- Optimization Methods on Riemannian Manifolds and Their Application to Shape Space
- The Geometry of Algorithms with Orthogonality Constraints
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A new, globally convergent Riemannian conjugate gradient method
- Function minimization by conjugate gradients
- Methods of conjugate gradients for solving linear systems