On the differentiability check in gradient sampling methods
DOI: 10.1080/10556788.2016.1178262 · zbMATH Open: 1354.65122 · OpenAlex: W2346151177 · MaRDI QID: Q2829571
Authors: Elias Salomão Helou, Lucas E. A. Simões, S. A. Santos
Publication date: 8 November 2016
Published in: Optimization Methods & Software
Full work available at URL: https://doi.org/10.1080/10556788.2016.1178262
Recommendations
- Non-asymptotic guarantees for sampling by stochastic gradient descent
- Convergence of the gradient sampling algorithm on directionally Lipschitz functions
- From Optimization to Sampling Through Gradient Flows
- A nonderivative version of the gradient sampling algorithm for nonsmooth nonconvex optimization
- Approximating Subdifferentials by Random Sampling of Gradients
- Convergence of the Gradient Sampling Algorithm for Nonsmooth Nonconvex Optimization
- Auxiliary Gradient-Based Sampling Algorithms
- A Robust Gradient Sampling Algorithm for Nonsmooth, Nonconvex Optimization
- A derivative-free approximate gradient sampling algorithm for finite minimax problems
- Sampling Gaussian distributions in Krylov spaces with conjugate gradients
Keywords: convergence; nonconvex optimization; algorithm; numerical result; nonsmooth optimization; nonmonotone line search; gradient sampling; differentiability check
Classification (MSC): Numerical mathematical programming methods (65K05); Nonconvex programming, global optimization (90C26)
Cites Work
- A Robust Gradient Sampling Algorithm for Nonsmooth, Nonconvex Optimization
- A Nonmonotone Line Search Technique and Its Application to Unconstrained Optimization
- Methods of descent for nondifferentiable optimization
- Title not available
- Nonsmooth optimization via quasi-Newton methods
- Approximating Subdifferentials by Random Sampling of Gradients
- Convergence of the Gradient Sampling Algorithm for Nonsmooth Nonconvex Optimization
- A nonderivative version of the gradient sampling algorithm for nonsmooth nonconvex optimization
- An adaptive gradient sampling algorithm for non-smooth optimization
- Optimizing condition numbers
Cited In (11)
- An efficient descent method for locally Lipschitz multiobjective optimization problems
- A new sequential optimality condition for constrained nonsmooth optimization
- On the local convergence analysis of the gradient sampling method for finite max-functions
- A fast gradient and function sampling method for finite-max functions
- A quasi-Newton proximal bundle method using gradient sampling technique for minimizing nonsmooth convex functions
- A derivative-free \(\mathcal{V} \mathcal{U}\)-algorithm for convex finite-max problems
- A new method based on the proximal bundle idea and gradient sampling technique for minimizing nonsmooth convex functions
- An adaptive gradient sampling algorithm for non-smooth optimization
- Nonsmooth spectral gradient methods for unconstrained optimization
- On the convergence analysis of a penalty algorithm for nonsmooth optimization and its performance for solving hard-sphere problems
- Auxiliary Gradient-Based Sampling Algorithms