On finite convergence and constraint identification of subgradient projection methods
DOI: 10.1007/BF01581092 · zbMATH Open: 0779.49019 · OpenAlex: W1997191791 · MaRDI QID: Q1802955
Authors: Sjur Didrik Flåm
Publication date: 29 June 1993
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://doi.org/10.1007/bf01581092
Keywords: differential inclusions; continuous time; finite convergence; constraint qualification; strict complementarity; Lyapunov's method; subgradient projection algorithm; constraint identification
Cites Work
- Title not available
- Title not available
- Title not available
- Title not available
- Projected gradient methods for linearly constrained problems
- Convex programming in Hilbert space
- On the Identification of Active Constraints
- Title not available
- On the Goldstein-Levitin-Polyak gradient projection method
- Global and Asymptotic Convergence Rate Estimates for a Class of Projected Gradient Processes
- Continuous algorithms for solution of convex optimization problems and finding saddle points of convex-concave functions with the use of projection operations
- The Gradient Projection Method under Mild Differentiability Conditions
- Generalized Kuhn–Tucker Conditions for Mathematical Programming Problems in a Banach Space
- A Necessary and Sufficient Qualification for Constrained Optimization
- A Continuous Approach to Oligopolistic Market Equilibrium
- Approximating saddle points as equilibria of differential inclusions
- The Gradient Projection Method Using Curry’s Steplength
- Stability of continuous subgradient algorithms
- Metric Projections and the Gradient Projection Method in Banach Spaces
- Openness of the metric projection in certain Banach spaces
- Application of the method of Lyapunov functions to the study of the convergence of numerical methods
- The method of Lyapunov functions in the study of continuous algorithms of mathematical programming
Cited In (10)
- Title not available
- Active‐Set Newton Methods and Partial Smoothness
- Proximal methods avoid active strict saddles of weakly convex functions
- The chain rule for VU-decompositions of nonsmooth functions
- Generic minimizing behavior in semialgebraic optimization
- Title not available
- Optimality, identifiability, and sensitivity
- Infeasibility Detection with Primal-Dual Hybrid Gradient for Large-Scale Linear Programming
- Finding Normalized Equilibrium in Convex-Concave Games
- Partial Smoothness and Constant Rank