A fast gradient and function sampling method for finite-max functions
Abstract: This paper tackles the unconstrained minimization of a class of nonsmooth, nonconvex functions that can be written as finite max-functions. A gradient- and function-based sampling method is proposed which, under suitable assumptions, either moves superlinearly to a minimizer of the problem of interest or superlinearly improves the optimality certificate. Global and local convergence analyses are presented, together with illustrative examples that corroborate and elucidate the theoretical results.
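To make the abstract's setting concrete, the sketch below illustrates the *generic* gradient sampling idea for a finite max-function — not the paper's accelerated method. The two-piece objective, the sampling radius, the Frank-Wolfe inner solver for the minimum-norm element of the convex hull of sampled gradients, and all parameter values are illustrative assumptions chosen for this example only.

```python
import numpy as np

# Hypothetical two-piece max-function (an assumption for illustration):
#   f(x) = max( x1^2 + (x2 - 1)^2 , x1^2 + (x2 + 1)^2 ),
# nonsmooth along the ridge x2 = 0, with minimizer (0, 0) and f* = 1.
fs = [lambda x: x[0]**2 + (x[1] - 1.0)**2,
      lambda x: x[0]**2 + (x[1] + 1.0)**2]
grads = [lambda x: np.array([2*x[0], 2*(x[1] - 1.0)]),
         lambda x: np.array([2*x[0], 2*(x[1] + 1.0)])]

def f(x):
    return max(fi(x) for fi in fs)

def min_norm_element(G, iters=200):
    """Frank-Wolfe for min ||G @ lam||^2 over the unit simplex."""
    m = G.shape[1]
    lam = np.full(m, 1.0 / m)
    for k in range(iters):
        grad = 2.0 * G.T @ (G @ lam)   # gradient of ||G lam||^2 in lam
        j = int(np.argmin(grad))       # linear minimization oracle: best vertex
        step = 2.0 / (k + 2.0)         # standard Frank-Wolfe step size
        e = np.zeros(m); e[j] = 1.0
        lam = (1 - step) * lam + step * e
    return G @ lam

rng = np.random.default_rng(0)
x, eps = np.array([2.0, 0.5]), 0.5
for _ in range(60):
    # Sample gradients of the *active* piece at the current point and at
    # nearby points; near the ridge this collects gradients of both pieces.
    pts = [x] + [x + eps * rng.uniform(-1, 1, 2) for _ in range(6)]
    G = np.column_stack([grads[int(np.argmax([fi(p) for fi in fs]))](p)
                         for p in pts])
    g = min_norm_element(G)            # approximate steepest-descent direction
    if np.linalg.norm(g) < 1e-8:
        eps *= 0.5                     # near-stationary: refine sampling radius
        continue
    t = 1.0
    while f(x - t * g) > f(x) - 1e-4 * t * np.dot(g, g) and t > 1e-12:
        t *= 0.5                       # Armijo-style backtracking line search
    x = x - t * g
# After 60 iterations the iterate is close to the minimizer (0, 0).
```

The minimum-norm element of the convex hull of nearby gradients approximates the steepest-descent direction of the nonsmooth objective; that is what lets the iteration cross the kink at x2 = 0 instead of oscillating across it.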
Recommendations
- On the local convergence analysis of the gradient sampling method for finite max-functions
- A Robust Gradient Sampling Algorithm for Nonsmooth, Nonconvex Optimization
- An approximate subgradient algorithm for unconstrained nonsmooth, nonconvex optimization
- A gradient sampling method based on ideal direction for solving nonsmooth optimization problems
- An adaptive gradient sampling algorithm for non-smooth optimization
Cites work
- scientific article; zbMATH DE number 439380
- scientific article; zbMATH DE number 46303
- scientific article; zbMATH DE number 3523479
- scientific article; zbMATH DE number 1113627
- scientific article; zbMATH DE number 2107836
- scientific article; zbMATH DE number 1424528
- scientific article; zbMATH DE number 6276223
- scientific article; zbMATH DE number 5060482
- A DC piecewise affine model and a bundling technique in nonconvex nonsmooth minimization
- A Robust Gradient Sampling Algorithm for Nonsmooth, Nonconvex Optimization
- A \(\mathcal{VU}\)-algorithm for convex minimization
- A bundle-Newton method for nonsmooth unconstrained minimization
- A quasi-Newton algorithm for nonconvex, nonsmooth optimization with global convergence guarantees
- A science fiction story in nonsmooth optimization originating at IIASA
- A sequential quadratic programming algorithm for nonconvex, nonsmooth constrained optimization
- A smooth method for the finite minimax problem
- Active Sets, Nonsmoothness, and Sensitivity
- An Improved Successive Linear Programming Algorithm
- An adaptive gradient sampling algorithm for non-smooth optimization
- Approximating Subdifferentials by Random Sampling of Gradients
- Convergence of the Gradient Sampling Algorithm for Nonsmooth Nonconvex Optimization
- Evaluating Derivatives
- Global and superlinear convergence of an algorithm for one-dimensional minimization of convex functions
- Identifying structure of nonsmooth convex functions by the bundle technique
- Methods of descent for nondifferentiable optimization
- New limited memory bundle method for large-scale nonsmooth optimization
- Newton methods for nonsmooth convex minimization: connections among \(\mathcal U\)-Lagrangian, Riemannian Newton and SQP methods
- Nonsmooth approach to optimization problems with equilibrium constraints. Theory, applications and numerical results
- Nonsmooth mechanics and applications
- Nonsmooth optimization via quasi-Newton methods
- Nonsmooth spectral gradient methods for unconstrained optimization
- Numerical optimization. Theoretical and practical aspects. Transl. from the French
- On the differentiability check in gradient sampling methods
- On the local convergence analysis of the gradient sampling method for finite max-functions
- Optimization of Lipschitz continuous functions
- Optimizing condition numbers
- Piecewise linear approximations in nonconvex nonsmooth optimization
- Practical Aspects of the Moreau--Yosida Regularization: Theoretical Preliminaries
- Restricted Step and Levenberg–Marquardt Techniques in Proximal Bundle Methods for Nonconvex Nondifferentiable Optimization
- SNOPT: An SQP Algorithm for Large-Scale Constrained Optimization
- Spectral projected subgradient with a momentum term for the Lagrangean dual approach
- Survey of Bundle Methods for Nonsmooth Optimization
- The Cutting-Plane Method for Solving Convex Programs
- The \(\mathcal{U}\)-Lagrangian of a convex function
Cited in (8)
- A gradient sampling method based on ideal direction for solving nonsmooth optimization problems
- A new sequential optimality condition for constrained nonsmooth optimization
- On the local convergence analysis of the gradient sampling method for finite max-functions
- A quasi-Newton proximal bundle method using gradient sampling technique for minimizing nonsmooth convex functions
- A new method based on the proximal bundle idea and gradient sampling technique for minimizing nonsmooth convex functions
- A Robust Gradient Sampling Algorithm for Nonsmooth, Nonconvex Optimization
- scientific article; zbMATH DE number 3956340
- A primal nonsmooth reformulation for bilevel optimization problems