Generalized weak sharp minima in cone-constrained convex optimization on Hadamard manifolds
DOI: 10.1080/02331934.2014.883514 · zbMATH Open: 1337.49019 · OpenAlex: W2054583071 · MaRDI QID: Q2808303 · FDO: Q2808303
Authors: Xiaobo Li, Nan-Jing Huang
Publication date: 23 May 2016
Published in: Optimization
Full work available at URL: https://doi.org/10.1080/02331934.2014.883514
Recommendations
- Generalized weak sharp minima in cone-constrained convex optimization with applications
- Weak sharp minima on Riemannian manifolds
- Nonconvex weak sharp minima on Riemannian manifolds
- Necessary conditions for weak sharp minima in cone-constrained optimization problems
- Existence of solutions for vector optimization on Hadamard manifolds
Keywords: Hadamard manifold; cone-constrained convex programming; generalized bounded weak sharp minima; generalized global weak sharp minima; generalized local weak sharp minima
MSC: Programming in abstract spaces (90C48); Fréchet and Gateaux differentiability in optimization (49J50)
Cites Work
- The Geometry of Algorithms with Orthogonality Constraints
- Newton's method on Riemannian manifolds and a geometric model for the human spine
- Nonsmooth analysis on smooth manifolds
- On the metric projection onto convex sets in Riemannian spaces
- Nonsmooth analysis and Hamilton--Jacobi equations on Riemannian manifolds
- Weak Sharp Minima in Mathematical Programming
- Weak sharp minima on Riemannian manifolds
- Monotone vector fields and the proximal point algorithm on Hadamard manifolds
- On the Identification of Active Constraints
- Weak Sharp Solutions of Variational Inequalities
- Proximal Point Algorithm On Riemannian Manifolds
- Weak Sharp Solutions of Variational Inequalities in Hilbert Spaces
- Generalized gradients and characterization of epi-Lipschitz sets in Riemannian manifolds
- Newton's method on Riemannian manifolds: Smale's point estimate theory under the γ-condition
- Weak sharp solutions for variational inequalities in Banach spaces
- On convergence of the Gauss-Newton method for convex composite optimization.
- Newton's method on Riemannian manifolds: covariant alpha theory
- Hoffman's Error Bound, Local Controllability, and Sensitivity Analysis
- Weak sharp minima revisited. II: Application to linear regularity and error bounds
- Weak sharp minima revisited. III: Error bounds for differentiable convex inclusions
- Weak Sharp Minima for Semi-infinite Optimization Problems with Applications
- A subdifferential condition for calmness of multifunctions
- Weak sharp minima for piecewise linear multiobjective optimization in normed spaces
- Kantorovich's theorem on Newton's method in Riemannian manifolds
- Strong uniqueness in sequential linear programming
- Iterative linear programming solution of convex programs
- Strong KKT conditions and weak sharp solutions in convex-composite optimization
- Quasi-tangent vectors in flow-invariance and optimization problems on Banach manifolds
- Optimizing matrix stability
- Characterizations of weak sharp minima for lower-\(C^1\) functions
- Generalized weak sharp minima in cone-constrained convex optimization with applications
Cited In (8)
- The concept of admissible sets in optimization problems on non-Hadamard Riemannian manifolds
- Weak sharpness and finite termination for variational inequalities on Hadamard manifolds
- Weak sharp minima on Riemannian manifolds
- Generalized weak sharp minima in cone-constrained convex optimization with applications
- Generalized weak sharp minima and the existence of strong Lagrangian multipliers in conic convex optimization
- Variational inequalities governed by strongly pseudomonotone vector fields on Hadamard manifolds
- Subdifferentials of perturbed distance function in Riemannian manifolds
- Nonconvex weak sharp minima on Riemannian manifolds