On the differentiability check in gradient sampling methods
DOI: 10.1080/10556788.2016.1178262 · zbMATH Open: 1354.65122 · OpenAlex: W2346151177 · MaRDI QID: Q2829571 · FDO: Q2829571
S. A. Santos, Lucas E. A. Simões, Elias Salomão Helou
Publication date: 8 November 2016
Published in: Optimization Methods & Software
Full work available at URL: https://doi.org/10.1080/10556788.2016.1178262
Keywords: convergence; nonconvex optimization; algorithm; numerical result; nonsmooth optimization; nonmonotone line search; gradient sampling; differentiability check
MSC classification: Numerical mathematical programming methods (65K05); Nonconvex programming, global optimization (90C26)
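For context on the indexed topic, the following is a minimal, hypothetical Python sketch of one iteration of a basic gradient sampling method in the spirit of the cited Burke–Lewis–Overton algorithm. The sampling radius, sample size, line-search constants, and the omission of the differentiability check itself are illustrative assumptions, not the procedure proposed in this publication.

    import numpy as np
    from scipy.optimize import minimize

    def gradient_sampling_step(f, grad_f, x, eps=0.1, m=None, rng=None):
        """One (simplified) gradient sampling iteration; returns (new_x, done)."""
        rng = np.random.default_rng() if rng is None else rng
        n = x.size
        m = 2 * n if m is None else m  # sample size; m >= n + 1 is typical
        # Sample m points uniformly in the eps-ball around x.
        u = rng.standard_normal((m, n))
        u /= np.linalg.norm(u, axis=1, keepdims=True)
        pts = x + eps * rng.uniform(size=(m, 1)) ** (1.0 / n) * u
        # NOTE: the paper's subject, the differentiability check, would verify
        # here that f is differentiable at each sampled point; we assume it is.
        G = np.vstack([grad_f(x)] + [grad_f(p) for p in pts])
        # Minimum-norm element of conv{g_0, ..., g_m}: a small QP over the
        # simplex, min ||lam @ G||^2 s.t. lam >= 0, sum(lam) = 1.
        k = G.shape[0]
        res = minimize(lambda lam: np.dot(lam @ G, lam @ G),
                       np.full(k, 1.0 / k),
                       method='SLSQP',
                       bounds=[(0.0, 1.0)] * k,
                       constraints={'type': 'eq',
                                    'fun': lambda lam: lam.sum() - 1.0})
        g = res.x @ G
        gnorm = np.linalg.norm(g)
        if gnorm < 1e-8:          # eps-stationary: stop (or shrink eps)
            return x, True
        d = -g / gnorm            # normalized descent direction
        t = 1.0                   # Armijo backtracking line search
        while f(x + t * d) > f(x) - 1e-4 * t * gnorm and t > 1e-12:
            t *= 0.5
        return x + t * d, False

A driver would loop this step, shrinking eps whenever done is returned, until eps falls below a tolerance, as in the standard method.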
Cites Work
- A Robust Gradient Sampling Algorithm for Nonsmooth, Nonconvex Optimization
- A Nonmonotone Line Search Technique and Its Application to Unconstrained Optimization
- Methods of descent for nondifferentiable optimization
- Nonsmooth optimization via quasi-Newton methods
- Approximating Subdifferentials by Random Sampling of Gradients
- Convergence of the Gradient Sampling Algorithm for Nonsmooth Nonconvex Optimization
- A Nonderivative Version of the Gradient Sampling Algorithm for Nonsmooth Nonconvex Optimization
- An adaptive gradient sampling algorithm for non-smooth optimization
- Optimizing Condition Numbers
Cited In (11)
- An efficient descent method for locally Lipschitz multiobjective optimization problems
- On the local convergence analysis of the gradient sampling method for finite max-functions
- A fast gradient and function sampling method for finite-max functions
- A quasi-Newton proximal bundle method using gradient sampling technique for minimizing nonsmooth convex functions
- A new method based on the proximal bundle idea and gradient sampling technique for minimizing nonsmooth convex functions
- An adaptive gradient sampling algorithm for non-smooth optimization
- Nonsmooth spectral gradient methods for unconstrained optimization
- On the convergence analysis of a penalty algorithm for nonsmooth optimization and its performance for solving hard-sphere problems
- A New Sequential Optimality Condition for Constrained Nonsmooth Optimization
- Auxiliary Gradient-Based Sampling Algorithms
- A derivative-free VU-algorithm for convex finite-max problems
Recommendations
- Non-asymptotic guarantees for sampling by stochastic gradient descent
- Convergence of the gradient sampling algorithm on directionally Lipschitz functions
- From Optimization to Sampling Through Gradient Flows
- A Nonderivative Version of the Gradient Sampling Algorithm for Nonsmooth Nonconvex Optimization
- Approximating Subdifferentials by Random Sampling of Gradients
- Convergence of the Gradient Sampling Algorithm for Nonsmooth Nonconvex Optimization
- Auxiliary Gradient-Based Sampling Algorithms
- A Robust Gradient Sampling Algorithm for Nonsmooth, Nonconvex Optimization
- A derivative-free approximate gradient sampling algorithm for finite minimax problems
- Sampling Gaussian distributions in Krylov spaces with conjugate gradients