Strong KKT conditions and weak sharp solutions in convex-composite optimization
From MaRDI portal
Recommendations
- Weak sharp minima revisited. I: Basic theory
- KKT conditions for weak* compact convex sets, theorems of the alternative, and optimality conditions
- Approximate optimality conditions for composite convex optimization problems
- Variational analysis of composite models with applications to continuous optimization
- Weak sharp minima revisited. II: Application to linear regularity and error bounds
Cites work
- scientific article; zbMATH DE number 417962 (no title available)
- scientific article; zbMATH DE number 1328979 (no title available)
- scientific article; zbMATH DE number 3399886 (no title available)
- Abadie's Constraint Qualification, Metric Regularity, and Error Bounds for Differentiable Convex Inequalities
- An Extension of the Karush–Kuhn–Tucker Necessity Conditions to Infinite Programming
- Characterizations of Local and Global Error Bounds for Convex Inequalities in Banach Spaces
- Characterizations of error bounds for lower semicontinuous functions on metric spaces
- Characterizations of the Strong Basic Constraint Qualifications
- Convex composite non-Lipschitz programming
- First- and Second-Order Epi-Differentiability in Nonlinear Programming
- Generalized Directional Derivatives and Subgradients of Nonconvex Functions
- Lagrange Multipliers and Optimality
- Linear Regularity for a Collection of Subsmooth Sets in Banach Spaces
- Local properties of algorithms for minimizing nonsmooth composite functions
- Majorizing Functions and Convergence of the Gauss–Newton Method for Convex Composite Optimization
- Metric Regularity and Constraint Qualifications for Convex Inequalities on Banach Spaces
- Metric Subregularity and Constraint Qualifications for Convex Generalized Equations in Banach Spaces
- Metric regularity and subdifferential calculus
- On convergence of the Gauss–Newton method for convex composite optimization
- Optimality conditions in mathematical programming and composite optimization
- Optimization and nonsmooth analysis
- Weak Sharp Minima in Mathematical Programming
- Weak Sharp Minima: Characterizations and Sufficient Conditions
- Weak sharp minima revisited. I: Basic theory
- Weak sharp minima revisited. III: Error bounds for differentiable convex inclusions
Cited in (15)
- Variational analysis of composite models with applications to continuous optimization
- Characterizing robust weak sharp solution sets of convex optimization problems with uncertainty
- Necessary conditions for weak sharp minima in cone-constrained optimization problems
- Optimality conditions for robust weak sharp efficient solutions of nonsmooth uncertain multiobjective optimization problems
- Isolated and proper efficiencies in semi-infinite vector optimization problems
- Riemannian linearized proximal algorithms for nonnegative inverse eigenvalue problem
- Uniform subsmoothness and linear regularity for a collection of infinitely many closed sets
- On convergence rates of linearized proximal algorithms for convex composite optimization with applications
- Generalized weak sharp minima in cone-constrained convex optimization on Hadamard manifolds
- Strong KKT, Second Order Conditions and Non-solid Cones in Vector Optimization
- Strong Fermat rules for constrained set-valued optimization problems on Banach spaces
- Generalized weak sharp minima in cone-constrained convex optimization with applications
- Weak Sharp Minima for Convex Infinite Optimization Problems in Normed Linear Spaces
- Linearized proximal algorithms with adaptive stepsizes for convex composite optimization with applications
- A sufficient minimality condition for convex composite functions
This page was built for publication: Strong KKT conditions and weak sharp solutions in convex-composite optimization
MaRDI item: Q623359