Sufficient Conditions for Instability of the Subgradient Method with Constant Step Size
Publication: Q6136655
DOI: 10.1137/22M1535723
arXiv: 2211.14852
OpenAlex: W4390586700
MaRDI QID: Q6136655
FDO: Q6136655
Authors: Cédric Josz, Lexiao Lai
Publication date: 17 January 2024
Published in: SIAM Journal on Optimization
Abstract: We provide sufficient conditions for instability of the subgradient method with constant step size around a local minimum of a locally Lipschitz semi-algebraic function. They are satisfied by several spurious local minima arising in robust principal component analysis and neural networks.
Full work available at URL: https://arxiv.org/abs/2211.14852
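For illustration only (not taken from the paper), the sketch below shows the subgradient method with a constant step size, x_{k+1} = x_k - alpha * g_k with g_k in the subdifferential of f, applied to the toy nonsmooth function f(x) = |x| rather than the robust PCA or neural network objectives mentioned in the abstract. It exhibits the characteristic behavior of a constant step near a minimizer: the iterates end up oscillating in a neighborhood of size on the order of alpha instead of settling at the minimum, which is the kind of (in)stability question the paper addresses.

```python
# Minimal sketch (assumed toy example, not the authors' code): subgradient
# method with constant step size on f(x) = |x|, whose unique minimizer is 0.

def subgradient_abs(x):
    """Return a subgradient of f(x) = |x| (any value in [-1, 1] is valid at 0)."""
    if x > 0:
        return 1.0
    if x < 0:
        return -1.0
    return 0.0

def subgradient_method(x0, alpha, num_iters):
    """Iterate x_{k+1} = x_k - alpha * g_k with a fixed step size alpha."""
    x = x0
    trajectory = [x]
    for _ in range(num_iters):
        x = x - alpha * subgradient_abs(x)
        trajectory.append(x)
    return trajectory

if __name__ == "__main__":
    traj = subgradient_method(x0=1.0, alpha=0.3, num_iters=10)
    # The tail oscillates (here between 0.1 and -0.2) rather than converging to 0.
    print(traj)
```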
Cites Work
- Variational Analysis
- Robust principal component analysis?
- Title not available
- Clarke Subgradients of Stratifiable Functions
- Local differentiability of distance functions
- Convergence of the Iterates of Descent Methods for Analytic Cost Functions
- Title not available
- Tangents to an analytic variety
- Characterization of metric regularity of subdifferentials
- Hölder metric subregularity with applications to proximal point method
- Nonconvergence to unstable points in urn models and stochastic approximations
- Metric Subregularity of Multifunctions: First and Second Order Infinitesimal Characterizations
- Higher-order metric subregularity and its applications
- Regularity and conditioning of solution mappings in variational analysis
- Stratifications de Whitney et théorème de Bertini-Sard
- Verdier and strict Thom stratifications in o-minimal structures
- (In-)Stability of Differential Inclusions
- On the Complexity of Robust PCA and ℓ1-Norm Low-Rank Matrix Approximation
- Hölder Stable Minimizers, Tilt Stability, and Hölder metric Regularity of Subdifferentials
- Gradient Descent Only Converges to Minimizers: Non-Isolated Critical Points and Invariant Regions
- Conservative set valued fields, automatic differentiation, stochastic gradient methods and deep learning
- An Introduction to Optimization on Smooth Manifolds
- Lyapunov stability of the subgradient method with constant step size