Sufficient Conditions for Instability of the Subgradient Method with Constant Step Size
Publication: 6136655
DOI: 10.1137/22M1535723
arXiv: 2211.14852
OpenAlex: W4390586700
MaRDI QID: Q6136655
Authors: Cédric Josz, Lexiao Lai
Publication date: 17 January 2024
Published in: SIAM Journal on Optimization
Abstract: We provide sufficient conditions for instability of the subgradient method with constant step size around a local minimum of a locally Lipschitz semi-algebraic function. They are satisfied by several spurious local minima arising in robust principal component analysis and neural networks.
Full work available at URL: https://arxiv.org/abs/2211.14852
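The abstract concerns the behavior of the subgradient method with a constant step size near a local minimum of a nonsmooth function. As a minimal illustration (not taken from the paper), the sketch below runs the method on the simple nonsmooth function f(x) = |x|: with a constant step, the iterates do not settle at the minimum x = 0 but eventually oscillate around it with magnitude on the order of the step size.

```python
# Illustrative sketch (not from the paper): subgradient method with
# constant step size on f(x) = |x|, whose unique minimum is x = 0.

def subgradient_method(x0, alpha, n_iters):
    """Iterate x_{k+1} = x_k - alpha * g_k on f(x) = |x|, where
    g_k = sign(x_k) is a subgradient (g = 0 is chosen at x = 0)."""
    xs = [x0]
    x = x0
    for _ in range(n_iters):
        g = (x > 0) - (x < 0)  # sign(x), an element of the subdifferential of |.|
        x = x - alpha * g
        xs.append(x)
    return xs

trajectory = subgradient_method(x0=1.0, alpha=0.3, n_iters=200)
# The tail of the trajectory oscillates around 0 instead of converging.
print(trajectory[-4:])
```

With a diminishing step size the method would converge here; the constant-step oscillation above is the elementary phenomenon behind the (in)stability questions the paper studies for locally Lipschitz semi-algebraic functions.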
Cites Work
- Variational Analysis
- Robust principal component analysis?
- Title not available
- Clarke Subgradients of Stratifiable Functions
- Local differentiability of distance functions
- Convergence of the Iterates of Descent Methods for Analytic Cost Functions
- Title not available
- Tangents to an analytic variety
- Characterization of metric regularity of subdifferentials
- Hölder metric subregularity with applications to proximal point method
- Nonconvergence to unstable points in urn models and stochastic approximations
- Metric subregularity of multifunctions: first and second order infinitesimal characterizations
- Higher-order metric subregularity and its applications
- Regularity and conditioning of solution mappings in variational analysis
- Stratifications de Whitney et théorème de Bertini-Sard
- Verdier and strict Thom stratifications in o-minimal structures
- (In-)stability of differential inclusions. Notions, equivalences, and Lyapunov-like characterizations
- On the complexity of robust PCA and \(\ell_1\)-norm low-rank matrix approximation
- Hölder stable minimizers, tilt stability, and Hölder metric regularity of subdifferentials
- Gradient descent only converges to minimizers: non-isolated critical points and invariant regions
- Conservative set valued fields, automatic differentiation, stochastic gradient methods and deep learning
- An Introduction to Optimization on Smooth Manifolds
- Lyapunov stability of the subgradient method with constant step size