An adaptive gradient sampling algorithm for non-smooth optimization
From MaRDI portal
Publication: 2867436
DOI: 10.1080/10556788.2012.714781
zbMath: 1284.49036
OpenAlex: W2084788303
MaRDI QID: Q2867436
Publication date: 19 December 2013
Published in: Optimization Methods and Software
Full work available at URL: https://doi.org/10.1080/10556788.2012.714781
Keywords: unconstrained optimization; global convergence; nonsmooth optimization; nonconvex optimization; Clarke subdifferential; gradient sampling; quadratic subproblem; line-search methods; warm-starting
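The keywords outline the ingredients of a gradient sampling method: sample gradients near the current iterate, solve a quadratic subproblem for an approximate minimum-norm element of the convex hull of those gradients, and take a line-search step along its negative. The following is a minimal, self-contained sketch of that generic scheme, not the adaptive algorithm of the paper itself; the test function `f`, the sampling radius `eps`, and all iteration counts are illustrative choices.

```python
import numpy as np

def f(x):
    # Nonsmooth, convex test function with kinks along both axes.
    return abs(x[0]) + 2.0 * abs(x[1])

def grad(x):
    # Gradient of f where it exists (almost everywhere).
    return np.array([np.sign(x[0]), 2.0 * np.sign(x[1])])

def proj_simplex(v):
    # Euclidean projection onto the probability simplex (sort-based method).
    u = np.sort(v)[::-1]
    css = np.cumsum(u)
    rho = np.nonzero(u * np.arange(1, len(v) + 1) > (css - 1.0))[0][-1]
    theta = (css[rho] - 1.0) / (rho + 1.0)
    return np.maximum(v - theta, 0.0)

def sampling_direction(x, eps, m=10, rng=None):
    # Sample m gradients in an eps-ball around x (plus the gradient at x),
    # then approximate the minimum-norm element of their convex hull by
    # projected gradient on the simplex of convex weights.
    rng = np.random.default_rng(0) if rng is None else rng
    pts = x + eps * rng.uniform(-1.0, 1.0, size=(m, x.size))
    G = np.vstack([grad(x)] + [grad(p) for p in pts]).T   # d x (m+1)
    Q = G.T @ G
    lam = np.full(G.shape[1], 1.0 / G.shape[1])
    step = 1.0 / (np.linalg.norm(Q, 2) + 1e-12)
    for _ in range(200):
        lam = proj_simplex(lam - step * (Q @ lam))
    return -(G @ lam)   # negative of the approximate min-norm element

x = np.array([1.3, -0.9])
eps = 0.1
for _ in range(50):
    d = sampling_direction(x, eps)
    if np.linalg.norm(d) < 1e-6:
        eps *= 0.5          # near-stationary for this radius: shrink it
        continue
    t = 1.0                  # backtracking Armijo line search
    while f(x + t * d) > f(x) - 1e-4 * t * np.dot(d, d) and t > 1e-12:
        t *= 0.5
    x = x + t * d
```

Near a kink the sampled gradients straddle the discontinuity, so their convex hull contains a short vector and the computed direction correctly flattens out, which is what distinguishes gradient sampling from plain steepest descent on nonsmooth functions.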
Related Items (23)
- A gradient sampling method based on ideal direction for solving nonsmooth optimization problems
- A quasi-Newton proximal bundle method using gradient sampling technique for minimizing nonsmooth convex functions
- Limited-memory BFGS with displacement aggregation
- A conjugate gradient sampling method for nonsmooth optimization
- A feasible SQP-GS algorithm for nonconvex, nonsmooth constrained optimization
- Nonsmooth spectral gradient methods for unconstrained optimization
- A quasi-Newton algorithm for nonconvex, nonsmooth optimization with global convergence guarantees
- A New Sequential Optimality Condition for Constrained Nonsmooth Optimization
- A Nonsmooth Trust-Region Method for Locally Lipschitz Functions with Application to Optimization Problems Constrained by Variational Inequalities
- An SQP method for minimization of locally Lipschitz functions with nonlinear constraints
- A limited-memory quasi-Newton algorithm for bound-constrained non-smooth optimization
- Manifold Sampling for Optimizing Nonsmooth Nonconvex Compositions
- An adaptive competitive penalty method for nonsmooth constrained optimization
- A fast gradient and function sampling method for finite-max functions
- A convergence analysis of the method of codifferential descent
- A new method based on the proximal bundle idea and gradient sampling technique for minimizing nonsmooth convex functions
- An efficient descent method for locally Lipschitz multiobjective optimization problems
- Derivative-free robust optimization by outer approximations
- On the differentiability check in gradient sampling methods
- Smoothing SQP Methods for Solving Degenerate Nonsmooth Constrained Optimization Problems with Applications to Bilevel Programs
- A geometric integration approach to nonsmooth, nonconvex optimisation
- Manifold Sampling for $\ell_1$ Nonconvex Optimization
- An inexact restoration-nonsmooth algorithm with variable accuracy for stochastic nonsmooth convex optimization problems in machine learning and stochastic linear complementarity problems
Uses Software
Cites Work
- Methods of descent for nondifferentiable optimization
- A Sequential Quadratic Programming Algorithm for Nonconvex, Nonsmooth Constrained Optimization
- A Nonderivative Version of the Gradient Sampling Algorithm for Nonsmooth Nonconvex Optimization
- Convergence of the Gradient Sampling Algorithm for Nonsmooth Nonconvex Optimization
- Optimization and nonsmooth analysis
- A Method for Solving Certain Quadratic Programming Problems Arising in Nonsmooth Optimization
- Optimization of Lipschitz continuous functions
- Pseudospectral Components and the Distance to Uncontrollability
- A Robust Gradient Sampling Algorithm for Nonsmooth, Nonconvex Optimization
- Minimizing the Condition Number for Small Rank Modifications
- New limited memory bundle method for large-scale nonsmooth optimization
- Convex Analysis
- A new approach to variable metric algorithms
- The Convergence of a Class of Double-rank Minimization Algorithms 1. General Considerations
- Approximating Subdifferentials by Random Sampling of Gradients
- Benchmarking optimization software with performance profiles