A nonlinear conjugate gradient method with complexity guarantees and its application to nonconvex regression
Publication: 6114952
DOI: 10.1016/j.ejco.2022.100044
zbMath: 1530.90082
arXiv: 2201.08568
MaRDI QID: Q6114952
Rémi Chan-Renous-Legoubin, Clément W. Royer
Publication date: 12 July 2023
Published in: EURO Journal on Computational Optimization
Full work available at URL: https://arxiv.org/abs/2201.08568
Numerical mathematical programming methods (65K05); Nonconvex programming, global optimization (90C26); Nonlinear programming (90C30); Methods of reduced gradient type (90C52)
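For orientation, the sketch below shows a generic nonlinear conjugate gradient iteration of the kind this entry concerns: a Polak-Ribière-Polyak (PRP+) direction with an Armijo backtracking line search and a steepest-descent restart safeguard. It is an illustration of the general technique only, not the restarted method with complexity guarantees studied in the indexed paper; the function name nonlinear_cg and all parameter values are assumptions made for this example.

```python
# Minimal sketch of a generic nonlinear conjugate gradient method:
# PRP+ direction (beta clipped at zero) with Armijo backtracking.
# Illustrative only; NOT the specific algorithm of the indexed paper.
import numpy as np

def nonlinear_cg(f, grad, x0, tol=1e-6, max_iter=1000,
                 armijo_c=1e-4, backtrack=0.5):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                              # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        if g @ d >= 0:                  # restart if d is not a descent direction
            d = -g
        # Armijo backtracking line search along d.
        t, fx = 1.0, f(x)
        while f(x + t * d) > fx + armijo_c * t * (g @ d) and t > 1e-16:
            t *= backtrack
        x_new = x + t * d
        g_new = grad(x_new)
        # PRP+ conjugacy parameter; clipping at zero acts as an
        # automatic restart whenever beta would be negative.
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

if __name__ == "__main__":
    # Tiny usage example on a smooth nonconvex test function
    # (two-dimensional Rosenbrock).
    rosen = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
    rosen_grad = lambda x: np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
        200 * (x[1] - x[0]**2),
    ])
    print(nonlinear_cg(rosen, rosen_grad, np.array([-1.2, 1.0])))
```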
Related Items
- A truncated three-term conjugate gradient method with complexity guarantees with applications to nonconvex regression problem
- A modified PRP-type conjugate gradient algorithm with complexity analysis and its application to image restoration problems
Cites Work
- A trust region algorithm with a worst-case iteration complexity of \(\mathcal{O}(\epsilon^{-3/2})\) for nonconvex optimization
- Worst-case evaluation complexity for unconstrained nonlinear optimization using high-order regularized models
- Adaptive cubic regularisation methods for unconstrained optimization. II: Worst-case function- and derivative-evaluation complexity
- Conjugate gradient algorithms in nonconvex optimization
- A globally convergent version of the Polak-Ribière conjugate gradient method
- Conjugate gradient methods with Armijo-type line searches
- Lower bounds for finding stationary points II: first-order methods
- A Newton-CG algorithm with complexity guarantees for smooth unconstrained optimization
- CUTEst: a constrained and unconstrained testing environment with safe threads for mathematical optimization
- Cubic regularization of Newton method and its global performance
- On the Complexity of Steepest Descent, Newton's and Regularized Newton's Methods for Nonconvex Unconstrained Optimization Problems
- Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Accelerated Methods for Nonconvex Optimization
- Complexity Analysis of Second-Order Line-Search Algorithms for Smooth Nonconvex Optimization
- The Fitting of Power Series, Meaning Polynomials, Illustrated on Band-Spectroscopic Data
- Nonlinear stepsize control, trust regions and regularizations for unconstrained optimization
- Benchmarking Derivative-Free Optimization Algorithms
- Worst-case evaluation complexity of non-monotone gradient-related algorithms for unconstrained optimization
- The Use of Quadratic Regularization with a Cubic Descent Condition for Unconstrained Optimization
- Convergence conditions, line search algorithms and trust region implementations for the Polak–Ribière conjugate gradient method
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Trust-Region Newton-CG with Strong Second-Order Complexity Guarantees for Nonconvex Optimization
- An inexact regularized Newton framework with a worst-case iteration complexity of \({\mathscr O}(\varepsilon^{-3/2})\) for nonconvex optimization
- Benchmarking optimization software with performance profiles