A decoupled first/second-order steps technique for nonconvex nonlinear unconstrained optimization with improved complexity bounds
From MaRDI portal
Publication: 2288191
DOI: 10.1007/s10107-018-1328-7
zbMath: 1437.90130
OpenAlex: W2892093336
MaRDI QID: Q2288191
Serge Gratton, C. W. Royer, Luis Nunes Vicente
Publication date: 17 January 2020
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: http://hdl.handle.net/10316/89471
- Numerical mathematical programming methods (65K05)
- Nonconvex programming, global optimization (90C26)
- Derivative-free methods and methods using generalized derivatives (90C56)
Related Items
- An active set trust-region method for bound-constrained optimization
- Detecting negative eigenvalues of exact and approximate Hessian matrices in optimization
- An adaptive regularization method in Banach spaces
- Complexity Analysis of Second-Order Line-Search Algorithms for Smooth Nonconvex Optimization
- Regional complexity analysis of algorithms for nonconvex smooth optimization
- A concise second-order complexity analysis for unconstrained optimization using high-order regularized models
- Derivative-free optimization methods
Uses Software
Cites Work
- Nonlinear stepsize control algorithms: complexity bounds for first- and second-order optimality
- A trust region algorithm with a worst-case iteration complexity of \(\mathcal{O}(\epsilon ^{-3/2})\) for nonconvex optimization
- Combining and scaling descent and negative curvature directions
- Adaptive cubic regularisation methods for unconstrained optimization. II: Worst-case function- and derivative-evaluation complexity
- Complexity bounds for second-order optimality in unconstrained optimization
- Cubic-regularization counterpart of a variable-norm trust-region method for unconstrained minimization
- Second-order optimality and beyond: characterization and evaluation complexity in convexly constrained nonlinear optimization
- Recent advances in trust region algorithms
- CUTEst: a constrained and unconstrained testing environment with safe threads for mathematical optimization
- Nonconvex optimization using negative curvature within a modified linesearch
- A second-order globally convergent direct-search method and its worst-case complexity
- Trust-Region Methods Without Using Derivatives: Worst Case Complexity and the NonSmooth Case
- Geometry of sample sets in derivative-free optimization: polynomial regression and underdetermined interpolation
- Introduction to Derivative-Free Optimization
- A Family of Trust-Region-Based Algorithms for Unconstrained Minimization with Strong Global Convergence Properties
- Newton’s Method with a Model Trust Region Modification
- On the use of directions of negative curvature in a modified Newton method
- Trust Region Methods
- Exploiting negative curvature directions in linesearch methods for unconstrained optimization
- Accelerated Methods for Nonconvex Optimization
- Finding approximate local minima faster than gradient descent
- Benchmarking Derivative-Free Optimization Algorithms
- The Use of Quadratic Regularization with a Cubic Descent Condition for Unconstrained Optimization
- Benchmarking optimization software with performance profiles