Conservative parametric optimality and the ridge method for tame min-max problems
DOI: 10.1007/s11228-023-00682-3 · zbMath: 1518.49031 · arXiv: 2104.00283 · MaRDI QID: Q6163857
Publication date: 26 July 2023
Published in: Set-Valued and Variational Analysis
Full work available at URL: https://arxiv.org/abs/2104.00283
Keywords: Clarke subdifferential; o-minimal structures; definable sets; min-max problems; first order methods; parametric optimality; conservative gradients; ridge algorithm
MSC: Numerical mathematical programming methods (65K05); Minimax problems in mathematical programming (90C47); Sensitivity, stability, parametric optimization (90C31); Nonsmooth analysis (49J52); Optimality conditions for minimax problems (49K35); Robustness in mathematical programming (90C17)
Cites Work
- Proximal alternating linearized minimization for nonconvex and nonsmooth problems
- Generalisations, examples, and counter-examples in analysis and optimisation. In honour of Michel Théra at 70
- Geometric categories and o-minimal structures
- Convergence of descent methods for semi-algebraic and tame problems: proximal algorithms, forward-backward splitting, and regularized Gauss-Seidel methods
- A theorem of the complement and some new o-minimal structures
- Conservative set valued fields, automatic differentiation, stochastic gradient methods and deep learning
- Convergence of constant step stochastic gradient descent for non-smooth non-convex functions
- Stochastic subgradient method converges on tame functions
- Conservative and semismooth derivatives are equivalent for semialgebraic maps
- Generalized subdifferentials: a Baire categorical approach
- Clarke Subgradients of Stratifiable Functions
- Optimization and nonsmooth analysis
- Extensions of subgradient calculus with applications to optimization
- Variational Analysis
- A Chain Rule for Essentially Smooth Lipschitz Functions
- The Structure of Conservative Gradient Fields
- Weakly-convex–concave min–max optimization: provable algorithms and applications in machine learning
- Efficient Search of First-Order Nash Equilibria in Nonconvex-Concave Smooth Min-Max Problems
- An Inertial Newton Algorithm for Deep Learning
- An Accelerated Inexact Proximal Point Method for Solving Nonconvex-Concave Min-Max Problems
- Stochastic Approximations and Differential Inclusions
- The Theory of Max-Min, with Applications
- Examples of Pathological Dynamics of the Subgradient Method for Lipschitz Path-Differentiable Functions