Global convergence of the gradient method for functions definable in o-minimal structures
Publication: 6052062
DOI: 10.1007/s10107-023-01937-5
arXiv: 2303.03534
OpenAlex: W4321595774
MaRDI QID: Q6052062
Publication date: 23 October 2023
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://arxiv.org/abs/2303.03534
MSC classifications:
- Nonconvex programming, global optimization (90C26)
- Nonlinear programming (90C30)
- Model theory of ordered structures; o-minimality (03C64)
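For context, the gradient method of the title is, in standard usage, the fixed-step descent iteration on a differentiable objective \(f\); the notation below (\(x_k\) for the iterates, \(\alpha\) for the step size) is conventional and assumed here, as the record itself does not spell it out:
\[ x_{k+1} = x_k - \alpha \nabla f(x_k), \qquad \alpha > 0. \]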
Cites Work
- Łojasiewicz inequalities in o-minimal structures
- Proximal alternating linearized minimization for nonconvex and nonsmooth problems
- Lectures on convex optimization
- Quadratic growth and critical point stability of semi-algebraic functions
- Maximum length of steepest descent curves for quasi-convex functions
- On a successive transformation of probability distribution and its application to the analysis of the optimum gradient method
- Nonconvergence to unstable points in urn models and stochastic approximations
- Asymptotic convergence of nonlinear contraction semigroups in Hilbert space
- Un exemple concernant le comportement asymptotique de la solution du problème \(du/dt+\partial\varphi(u)\ni 0\)
- On gradients of functions definable in o-minimal structures
- Expansions of the real field with power functions
- The elementary theory of restricted analytic fields with exponentiation
- Calculus of the exponent of Kurdyka-Łojasiewicz inequality and its applications to linear convergence of first-order methods
- Geometric categories and o-minimal structures
- Convergence of descent methods for semi-algebraic and tame problems: proximal algorithms, forward-backward splitting, and regularized Gauss-Seidel methods
- Cauchy and the gradient method
- Conservative set valued fields, automatic differentiation, stochastic gradient methods and deep learning
- Stochastic subgradient method converges on tame functions
- Rectifiability of self-contracted curves in the Euclidean space and applications
- First-order methods almost always avoid strict saddle points
- Projections of semi-analytic sets
- Minimization of functions having Lipschitz continuous first partial derivatives
- A new decision method for elementary algebra
- Curves of Descent
- Variational Analysis in Sobolev and BV Spaces
- Proximal Alternating Minimization and Projection Methods for Nonconvex Problems: An Approach Based on the Kurdyka-Łojasiewicz Inequality
- Ubiquitous algorithms in convex optimization generate self-contracted sequences
- Clarke Subgradients of Stratifiable Functions
- Decoding by Linear Programming
- Characterizations of Łojasiewicz inequalities: Subgradient flows, talweg, convexity
- A generalization of the Tarski-Seidenberg theorem, and some nondefinability results
- Definable Sets in Ordered Structures. I
- The real field with convergent generalized power series
- Quasianalytic Denjoy-Carleman classes and o-minimality
- First Order Methods Beyond Convexity and Lipschitz Gradient Continuity with Applications to Quadratic Inverse Problems
- Gradient Descent Only Converges to Minimizers: Non-Isolated Critical Points and Invariant Regions
- Bounding the length of gradient trajectories
- Learning deep linear neural networks: Riemannian gradient flows and convergence to global minimizers
- The Łojasiewicz Inequality for Nonsmooth Subanalytic Functions with Applications to Subgradient Dynamical Systems
- Convergence of the Iterates of Descent Methods for Analytic Cost Functions
- Convergence Conditions for Ascent Methods
- On Steepest Descent
- Model completeness results for expansions of the ordered field of real numbers by restricted Pfaffian functions and the exponential function
- A Descent Lemma Beyond Lipschitz Gradient Continuity: First-Order Methods Revisited and Applications
- The general theory of relaxation methods applied to linear systems
- The method of steepest descent for non-linear minimization problems
- The measure of the critical values of differentiable maps
- Analysis II
- Proof of the gradient conjecture of R. Thom.
- A geometric approach of gradient descent algorithms in linear neural networks
- Certifying the Absence of Spurious Local Minima at Infinity