Trust-Region Methods Without Using Derivatives: Worst Case Complexity and the Nonsmooth Case
DOI: 10.1137/151005683
zbMath: 1348.90572
OpenAlex: W2528932844
Wikidata: Q58040486 (Scholia: Q58040486)
MaRDI QID: Q2826817
Rohollah Garmanjani, D. Júdice, Luis Nunes Vicente
Publication date: 11 October 2016
Published in: SIAM Journal on Optimization
Full work available at URL: https://doi.org/10.1137/151005683
Keywords: smoothing; composite functions; trust-region methods; nonsmoothness; derivative-free optimization (DFO); worst case complexity (WCC)
MSC classification: Nonlinear programming (90C30); Derivative-free methods and methods using generalized derivatives (90C56)
Related Items
- A Trust-region Method for Nonsmooth Nonconvex Optimization
- Survey Descent: A Multipoint Generalization of Gradient Descent for Nonsmooth Optimization
- Model-Based Derivative-Free Methods for Convex-Constrained Optimization
- Manifold Sampling for Optimization of Nonconvex Functions That Are Piecewise Linear Compositions of Smooth Components
- ASTRO-DF: A Class of Adaptive Sampling Trust-Region Algorithms for Derivative-Free Stochastic Optimization
- A Stochastic Levenberg-Marquardt Method Using Random Models with Complexity Results
- A note on the worst-case complexity of nonlinear stepsize control methods for convex smooth unconstrained optimization
- Scalable subspace methods for derivative-free nonlinear least-squares optimization
- Direct Search Based on Probabilistic Descent in Reduced Spaces
- On the construction of quadratic models for derivative-free trust-region algorithms
- Complexity bound of trust-region methods for convex smooth unconstrained multiobjective optimization
- Quadratic regularization methods with finite-difference gradient approximations
- Worst-case evaluation complexity of a derivative-free quadratic regularization method
- An indicator for the switch from derivative-free to derivative-based optimization
- CGRS: an advanced hybrid method for global optimization of continuous functions closely coupling extended random search and conjugate gradient method
- Manifold Sampling for Optimizing Nonsmooth Nonconvex Compositions
- Optimality of orders one to three and beyond: characterization and evaluation complexity in constrained nonconvex optimization
- Second-order optimality and beyond: characterization and evaluation complexity in convexly constrained nonlinear optimization
- Trust-Region Methods for the Derivative-Free Optimization of Nonsmooth Black-Box Functions
- Inexact derivative-free optimization for bilevel learning
- Derivative-free robust optimization by outer approximations
- A decoupled first/second-order steps technique for nonconvex nonlinear unconstrained optimization with improved complexity bounds
- A derivative-free Gauss-Newton method
- Worst-case evaluation complexity of derivative-free nonmonotone line search methods for solving nonlinear systems of equations
- Derivative-free optimization methods
- Manifold Sampling for $\ell_1$ Nonconvex Optimization
Cites Work
- Adaptive cubic regularisation methods for unconstrained optimization. II: Worst-case function- and derivative-evaluation complexity
- Computation of sparse low degree interpolating polynomials and their application to derivative-free optimization
- Smoothing methods for nonsmooth, nonconvex minimization
- Worst case complexity of direct search
- Incorporating minimum Frobenius norm models in direct search
- Introductory lectures on convex optimization. A basic course.
- A class of smoothing functions for nonlinear and mixed complementarity problems
- A derivative-free trust-region algorithm for composite nonsmooth optimization
- Recent advances in trust region algorithms
- Random gradient-free minimization of convex functions
- Geometry of interpolation sets in derivative free optimization
- Cubic regularization of Newton method and its global performance
- Smoothing and worst-case complexity for direct-search methods in nonsmooth optimization
- Epi-convergent Smoothing with Applications to Convex Composite Functions
- On the Oracle Complexity of First-Order and Derivative-Free Algorithms for Smooth Nonconvex Minimization
- Smoothing Nonlinear Conjugate Gradient Method for Image Restoration Using Nonsmooth Nonconvex Minimization
- On the Complexity of Steepest Descent, Newton's and Regularized Newton's Methods for Nonconvex Unconstrained Optimization Problems
- Self-Correcting Geometry in Model-Based Algorithms for Derivative-Free Unconstrained Optimization
- On the Evaluation Complexity of Composite Function Minimization with Applications to Nonconvex Nonlinear Programming
- Smoothing Projected Gradient Method and Its Application to Stochastic Linear Complementarity Problems
- On the geometry phase in model-based algorithms for derivative-free optimization
- Recursive Trust-Region Methods for Multiscale Nonlinear Optimization
- Introduction to Derivative-Free Optimization
- Conditions for convergence of trust region algorithms for nonsmooth optimization
- Variational Analysis
- Trust Region Methods
- Benchmarking Derivative-Free Optimization Algorithms
- Global Convergence of General Derivative-Free Trust-Region Algorithms to First- and Second-Order Critical Points
- CUTEr and SifDec
- Optimality Measures for Performance Profiles
- Benchmarking optimization software with performance profiles
- Worst case complexity of direct search under convexity