A new, globally convergent Riemannian conjugate gradient method
From MaRDI portal
Publication: 5248217
DOI: 10.1080/02331934.2013.836650
zbMath: 1311.65072
arXiv: 1302.0125
OpenAlex: W3098671196
Wikidata: Q115301404 (Scholia: Q115301404)
MaRDI QID: Q5248217
Publication date: 28 April 2015
Published in: Optimization
Full work available at URL: https://arxiv.org/abs/1302.0125
Keywords: global convergence; conjugate gradient method; Wolfe conditions; Riemannian optimization; 'scaled' vector transport
MSC: Numerical mathematical programming methods (65K05); Nonlinear programming (90C30); Numerical methods based on nonlinear programming (49M37)
Related Items (43)
A Dai-Yuan-type Riemannian conjugate gradient method with the weak Wolfe conditions ⋮ Riemannian Conjugate Gradient Methods: General Framework and Specific Algorithms with Convergence Analyses ⋮ Riemannian optimization approach to structure-preserving model order reduction of integral-differential systems on the product of two Stiefel manifolds ⋮ Unnamed Item ⋮ Riemannian stochastic fixed point optimization algorithm ⋮ Riemannian conjugate gradient methods for computing the extreme eigenvalues of symmetric tensors ⋮ Optimizing Oblique Projections for Nonlinear Systems using Trajectories ⋮ Transportless conjugate gradient for optimization on Stiefel manifold ⋮ A limited-memory Riemannian symmetric rank-one trust-region method with a restart strategy ⋮ Completely positive factorization by a Riemannian smoothing method ⋮ Riemannian stochastic variance-reduced cubic regularized Newton method for submanifold optimization ⋮ Convergence of Gradient-Based Block Coordinate Descent Algorithms for Nonorthogonal Joint Approximate Diagonalization of Matrices ⋮ Adaptive trust-region method on Riemannian manifold ⋮ Proximal gradient algorithm with trust region scheme on Riemannian manifold ⋮ Riemannian conjugate gradient method for low-rank tensor completion ⋮ Riemannian optimization on unit sphere with \(p\)-norm and its applications ⋮ Faster Riemannian Newton-type optimization by subsampling and cubic regularization ⋮ Solving PhaseLift by Low-Rank Riemannian Optimization Methods for Complex Semidefinite Constraints ⋮ Memoryless quasi-Newton methods based on the spectral-scaling Broyden family for Riemannian optimization ⋮ Riemannian Modified Polak-Ribière-Polyak Conjugate Gradient Order Reduced Model by Tensor Techniques ⋮ Multiobjective conjugate gradient methods on Riemannian manifolds ⋮ Generalized left-localized Cayley parametrization for optimization with orthogonality constraints ⋮ A hybrid Riemannian conjugate gradient method for nonconvex optimization problems ⋮ A Riemannian derivative-free Polak-Ribière-Polyak method for tangent vector field ⋮ Cayley-transform-based gradient and conjugate gradient algorithms on Grassmann manifolds ⋮ AUV based source seeking with estimated gradients ⋮ Intrinsic representation of tangent vectors and vector transports on matrix manifolds ⋮ A Geometric Nonlinear Conjugate Gradient Method for Stochastic Inverse Eigenvalue Problems ⋮ A Riemannian conjugate gradient method for optimization on the Stiefel manifold ⋮ Riemannian conjugate gradient methods with inverse retraction ⋮ Hybrid Riemannian conjugate gradient methods with global convergence properties ⋮ A Riemannian Fletcher-Reeves Conjugate Gradient Method for Doubly Stochastic Inverse Eigenvalue Problems ⋮ A Riemannian structure for correlation matrices ⋮ Sufficient descent Riemannian conjugate gradient methods ⋮ Spectral residual method for nonlinear equations on Riemannian manifolds ⋮ On matrix exponentials and their approximations related to optimization on the Stiefel manifold ⋮ Blind Deconvolution by a Steepest Descent Algorithm on a Quotient Manifold ⋮ Global convergence of Riemannian line search methods with a Zhang-Hager-type condition ⋮ A Riemannian gradient ascent algorithm with applications to orthogonal approximation problems of symmetric tensors ⋮ Global convergence of Hager-Zhang type Riemannian conjugate gradient method ⋮ A generalized geometric spectral conjugate gradient algorithm for finding zero of a monotone tangent vector field on a constant curvature Hadamard manifold ⋮ Low-rank matrix completion via preconditioned optimization on the Grassmann manifold ⋮ Sequential optimality conditions for nonlinear optimization on Riemannian manifolds and a globally convergent augmented Lagrangian method
Cites Work
- Projection-like Retractions on Matrix Manifolds
- Optimization Methods on Riemannian Manifolds and Their Application to Shape Space
- Descent Property and Global Convergence of the Fletcher–Reeves Method with Inexact Line Search
- The Geometry of Algorithms with Orthogonality Constraints
- A Riemannian Optimization Approach to the Matrix Singular Value Decomposition
- Function minimization by conjugate gradients
- Methods of conjugate gradients for solving linear systems