A second-order globally convergent direct-search method and its worst-case complexity
From MaRDI portal
Publication: 2810113
DOI: 10.1080/02331934.2015.1124271
zbMath: 1338.90463
OpenAlex: W2336146308
Wikidata: Q58040477 (Scholia: Q58040477)
MaRDI QID: Q2810113
C. W. Royer, Serge Gratton, Luis Nunes Vicente
Publication date: 31 May 2016
Published in: Optimization
Full work available at URL: http://oatao.univ-toulouse.fr/16988/1/gratton_16988.pdf
Mathematics Subject Classification:
- Abstract computational complexity for mathematical programming problems (90C60)
- Derivative-free methods and methods using generalized derivatives (90C56)
- Numerical methods based on necessary conditions (49M05)
Related Items
- Detecting negative eigenvalues of exact and approximate Hessian matrices in optimization
- Generating set search using simplex gradients for bound-constrained black-box optimization
- A decoupled first/second-order steps technique for nonconvex nonlinear unconstrained optimization with improved complexity bounds
- Derivative-free optimization methods
Cites Work
- Combining and scaling descent and negative curvature directions
- Adaptive cubic regularisation methods for unconstrained optimization. II: Worst-case function- and derivative-evaluation complexity
- Complexity bounds for second-order optimality in unconstrained optimization
- Computation of sparse low degree interpolating polynomials and their application to derivative-free optimization
- Worst case complexity of direct search
- Mesh adaptive direct search with second directional derivative-based Hessian update
- CUTEst: a constrained and unconstrained testing environment with safe threads for mathematical optimization
- A generating set search method using curvature information
- Nonconvex optimization using negative curvature within a modified linesearch
- Smoothing and worst-case complexity for direct-search methods in nonsmooth optimization
- On the Oracle Complexity of First-Order and Derivative-Free Algorithms for Smooth Nonconvex Minimization
- A subclass of generating set search with convergence to second-order stationary points
- Coordinate search algorithms in multilevel optimization
- On the Convergence of Pattern Search Algorithms
- Convergence of Mesh Adaptive Direct Search to Second-Order Stationary Points
- Recursive Trust-Region Methods for Multiscale Nonlinear Optimization
- Introduction to Derivative-Free Optimization
- A superlinearly convergent algorithm for minimization without evaluating derivatives
- On the use of directions of negative curvature in a modified newton method
- Curvilinear Stabilization Techniques for Truncated Newton Methods in Large Scale Unconstrained Optimization
- Exploiting negative curvature directions in linesearch methods for unconstrained optimization
- Benchmarking Derivative-Free Optimization Algorithms
- Global Convergence of General Derivative-Free Trust-Region Algorithms to First- and Second-Order Critical Points
- Second-Order Behavior of Pattern Search
- Mesh Adaptive Direct Search Algorithms for Constrained Optimization
- Direct Search Based on Probabilistic Descent
- Theory of Positive Linear Dependence
- Benchmarking optimization software with performance profiles