Memoryless quasi-Newton methods based on the spectral-scaling Broyden family for Riemannian optimization
Publication: 6161550
DOI: 10.1007/s10957-023-02183-7
zbMath: 1515.65159
MaRDI QID: Q6161550
Authors: Hiroshi Yabe, Masashi Takemura, Yasushi Narushima, Shummin Nakayama
Publication date: 27 June 2023
Published in: Journal of Optimization Theory and Applications
MSC classifications:
- Numerical mathematical programming methods (65K05)
- Methods of quasi-Newton type (90C53)
- Programming in abstract spaces (90C48)
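The title refers to memoryless quasi-Newton methods, which rebuild the inverse-Hessian approximation at every iteration from a single curvature pair rather than storing a matrix. As a rough illustration of the idea, the sketch below computes a Euclidean spectral-scaled memoryless BFGS direction (the phi = 1 member of the Broyden family). The function name, the Oren-Luenberger-type scaling, and the toy quadratic are assumptions for illustration; the paper's actual algorithms operate on Riemannian manifolds, where s and y are defined via retractions and vector transports.

```python
import numpy as np

def memoryless_ss_bfgs_direction(g, s, y):
    # Hypothetical sketch: direction d = -H g, where H is a single
    # spectral-scaled BFGS update (the phi = 1 member of the Broyden
    # family) of the identity, built only from the latest pair (s, y).
    # Euclidean analogue only; the paper's method is Riemannian.
    sy = s @ y
    if sy <= 1e-12 * np.linalg.norm(s) * np.linalg.norm(y):
        return -g  # weak curvature: fall back to steepest descent
    gamma = sy / (y @ y)  # Oren-Luenberger-type scaling (an assumed choice)
    rho = 1.0 / sy
    sg, yg = s @ g, y @ g
    return (-gamma * g
            + gamma * rho * (yg * s + sg * y)
            - (rho + gamma * rho ** 2 * (y @ y)) * sg * s)

# Toy run on the quadratic f(x) = x^T A x / 2 with exact line search.
A = np.diag([1.0, 10.0, 100.0])
x = np.array([1.0, 1.0, 1.0])
x_prev = g_prev = None
for k in range(30):
    g = A @ x
    d = -g if k == 0 else memoryless_ss_bfgs_direction(g, x - x_prev, g - g_prev)
    alpha = -(g @ d) / (d @ A @ d)  # exact minimizer of f along d
    x_prev, g_prev = x, g
    x = x + alpha * d
print(np.linalg.norm(A @ x))  # gradient norm, should be near zero
```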
Cites Work
- A Dai-Yuan-type Riemannian conjugate gradient method with the weak Wolfe conditions
- Riemannian optimization for registration of curves in elastic shape analysis
- Nonlinear conjugate gradient methods with sufficient descent properties for unconstrained optimization
- Low-rank tensor completion by Riemannian optimization
- Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization
- Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization
- Fisher lecture: Dimension reduction in regression
- Spectral scaling BFGS method
- Spectral-scaling quasi-Newton methods with updates from the one parameter of the Broyden family
- Simple algorithms for optimization on Riemannian manifolds with constraints
- Riemannian conjugate gradient methods with inverse retraction
- Hybrid Riemannian conjugate gradient methods with global convergence properties
- Sufficient descent Riemannian conjugate gradient methods
- Sequential optimality conditions for nonlinear optimization on Riemannian manifolds and a globally convergent augmented Lagrangian method
- Memoryless quasi-Newton methods based on spectral-scaling Broyden family for unconstrained optimization
- A Riemannian symmetric rank-one trust-region method
- A modified self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno method for unconstrained optimization
- Trust-region methods on Riemannian manifolds
- Global convergence of a modified Fletcher-Reeves conjugate gradient method with Armijo-type line search
- Optimization theory and methods. Nonlinear programming
- A family of three-term conjugate gradient methods with sufficient descent property for unconstrained optimization
- A sufficient descent three-term conjugate gradient method via symmetric rank-one update for large-scale optimization
- Pymanopt: A Python Toolbox for Optimization on Manifolds using Automatic Differentiation
- Low-Rank Matrix Completion by Riemannian Optimization
- Optimization Methods on Riemannian Manifolds and Their Application to Shape Space
- Manopt, a Matlab toolbox for optimization on manifolds
- A Broyden Class of Quasi-Newton Methods for Riemannian Optimization
- A Three-Term Conjugate Gradient Method with Sufficient Descent Property for Unconstrained Optimization
- A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence
- Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent
- Quasi-Newton Algorithms with Updates from the Preconvex Part of Broyden's Family
- Updating Quasi-Newton Matrices with Limited Storage
- Conjugate Gradient Methods with Inexact Searches
- Descent three-term conjugate gradient methods based on secant conditions for unconstrained optimization
- A Riemannian BFGS Method Without Differentiated Retraction for Nonconvex Optimization Problems
- A memoryless symmetric rank-one method with sufficient descent property for unconstrained optimization
- On measure functions for the self-scaling updating formulae for quasi-Newton methods
- An Introduction to Optimization on Smooth Manifolds
- Sequential Quadratic Optimization for Nonlinear Optimization Problems on Riemannian Manifolds
- Global rates of convergence for nonconvex optimization on manifolds
- Riemannian Optimization and Its Applications
- A new, globally convergent Riemannian conjugate gradient method
- Likelihood-Based Sufficient Dimension Reduction
- A Nonlinear Conjugate Gradient Algorithm with an Optimal Property and an Improved Wolfe Line Search
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Some descent three-term conjugate gradient methods and their global convergence
- A modified BFGS method and its global convergence in nonconvex minimization
- Benchmarking optimization software with performance profiles