Riemannian conjugate gradient methods with inverse retraction
From MaRDI portal
Publication: 2023689
DOI: 10.1007/s10589-020-00219-6
zbMath: 1466.90124
OpenAlex: W3061101525
Wikidata: Q115384043 (Scholia: Q115384043)
MaRDI QID: Q2023689
Publication date: 3 May 2021
Published in: Computational Optimization and Applications
Full work available at URL: https://doi.org/10.1007/s10589-020-00219-6
Keywords: Stiefel manifold, conjugate gradient method, retraction, Riemannian optimization, fixed-rank manifold, inverse retraction
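The keywords above describe the paper's central idea: in Riemannian conjugate gradient methods, the vector transport of the previous search direction can be replaced by an inverse retraction. The sketch below is only illustrative and is not the paper's exact algorithm: it runs a Fletcher-Reeves CG iteration on the unit sphere (minimizing a Rayleigh quotient) using the projective retraction, its closed-form inverse, and a steepest-descent restart safeguard; all function names are our own.

```python
import numpy as np

# Illustrative sketch, not the paper's exact method.
# Retraction (projective):      R_x(v) = (x + v) / ||x + v||
# Its inverse (for x^T y > 0):  R_x^{-1}(y) = y / (x^T y) - x
# Since R_{x_k}(alpha_k d_k) = x_{k+1}, the vector
# -R_{x_{k+1}}^{-1}(x_k) / alpha_k approximates d_k transported to x_{k+1}.

def rcg_sphere(A, x0, iters=500, tol=1e-10):
    """Minimize the Rayleigh quotient f(x) = x^T A x over the unit sphere."""
    x = x0 / np.linalg.norm(x0)
    grad = lambda x: 2.0 * (A @ x - (x @ A @ x) * x)   # Riemannian gradient
    retract = lambda x, v: (x + v) / np.linalg.norm(x + v)
    inv_retract = lambda x, y: y / (x @ y) - x         # inverse retraction
    g = grad(x)
    d = -g
    for _ in range(iters):
        if g @ d >= 0.0:            # safeguard: restart with steepest descent
            d = -g
        # backtracking Armijo line search along d
        f, alpha = x @ A @ x, 1.0
        for _ in range(60):
            x_trial = retract(x, alpha * d)
            if x_trial @ A @ x_trial <= f + 1e-4 * alpha * (g @ d):
                break
            alpha *= 0.5
        x_new = retract(x, alpha * d)
        g_new = grad(x_new)
        if np.linalg.norm(g_new) < tol:
            return x_new
        beta = (g_new @ g_new) / (g @ g)               # Fletcher-Reeves
        d_tr = -inv_retract(x_new, x) / alpha          # transported direction
        d = -g_new + beta * d_tr
        x, g = x_new, g_new
    return x
```

For example, with A = diag(1, 2, 3) the iterates should converge to the eigenvector of the smallest eigenvalue, where the Rayleigh quotient equals 1.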
Related Items
- Riemannian Conjugate Gradient Methods: General Framework and Specific Algorithms with Convergence Analyses
- Sequential Quadratic Optimization for Nonlinear Optimization Problems on Riemannian Manifolds
- A Riemannian subspace BFGS trust region method
- A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds
- Adaptive trust-region method on Riemannian manifold
- Riemannian optimization on unit sphere with \(p\)-norm and its applications
- Memoryless quasi-Newton methods based on the spectral-scaling Broyden family for Riemannian optimization
- Generalized left-localized Cayley parametrization for optimization with orthogonality constraints
- A hybrid Riemannian conjugate gradient method for nonconvex optimization problems
- Approximated logarithmic maps on Riemannian manifolds and their applications
- Riemannian optimization with a preconditioning scheme on the generalized Stiefel manifold
- Sequential optimality conditions for nonlinear optimization on Riemannian manifolds and a globally convergent augmented Lagrangian method
Uses Software
Cites Work
- A feasible method for optimization with orthogonality constraints
- \(\varepsilon\)-subgradient algorithms for locally Lipschitz functions on Riemannian manifolds
- A Dai-Yuan-type Riemannian conjugate gradient method with the weak Wolfe conditions
- Low-rank retractions: a survey and new results
- A brief introduction to manifold optimization
- A Riemannian symmetric rank-one trust-region method
- Trust-region methods on Riemannian manifolds
- Optimization theory and methods. Nonlinear programming
- Low-Rank Matrix Completion by Riemannian Optimization
- Projection-like Retractions on Matrix Manifolds
- Optimization Methods on Riemannian Manifolds and Their Application to Shape Space
- Manopt, a Matlab toolbox for optimization on manifolds
- A Broyden Class of Quasi-Newton Methods for Riemannian Optimization
- A Riemannian Gradient Sampling Algorithm for Nonsmooth Optimization on Manifolds
- Descent Property and Global Convergence of the Fletcher-Reeves Method with Inexact Line Search
- The Geometry of Algorithms with Orthogonality Constraints
- Variational Analysis
- Empirical Arithmetic Averaging Over the Compact Stiefel Manifold
- A Riemannian BFGS Method Without Differentiated Retraction for Nonconvex Optimization Problems
- Line Search Algorithms for Locally Lipschitz Functions on Riemannian Manifolds
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- Riemannian Stochastic Variance Reduced Gradient Algorithm with Retraction and Vector Transport
- A Gradient Sampling Method on Algebraic Varieties and Application to Nonsmooth Low-Rank Optimization
- A new, globally convergent Riemannian conjugate gradient method
- Function minimization by conjugate gradients
- Stochastic Gradient Descent on Riemannian Manifolds
- Dynamical Low-Rank Approximation