Finite convergence of algorithms for nonlinear programs and variational inequalities
From MaRDI portal
Publication: 809897
DOI: 10.1007/BF00940629 · zbMath: 0732.90076 · OpenAlex: W2010413631 · MaRDI QID: Q809897
Jerzy Kyparisis, Faiz A. Al-Khayyal
Publication date: 1991
Published in: Journal of Optimization Theory and Applications
Full work available at URL: https://doi.org/10.1007/bf00940629
Nonlinear programming (90C30); Variational inequalities (49J40); Computational methods for problems pertaining to operations research and mathematical programming (90-08)
Related Items
- A proximal method for identifying active manifolds
- Proximal methods avoid active strict saddles of weakly convex functions
- Partial Smoothness and Constant Rank
- Computing proximal points of convex functions with inexact subgradients
- Relax-and-split method for nonconvex inverse problems
- Optimality, identifiability, and sensitivity
- Accelerating convergence of cutting plane algorithms for disjoint bilinear programming
- Variational analysis on local sharp minima via exact penalization
- Minimum principle sufficiency
- Global convergence and finite termination of a class of smooth penalty function algorithms
- Derivative-free optimization methods for finite minimax problems
- Generic Minimizing Behavior in Semialgebraic Optimization
- Active-Set Newton Methods and Partial Smoothness
- Geometrical interpretation of the predictor-corrector type algorithms in structured optimization problems
- On finite convergence of proximal point algorithms for variational inequalities
- Note on solving linear complementarity problems as jointly constrained bilinear programs
- First-order conditions for isolated locally optimal solutions
Cites Work
- Sensitivity analysis for variational inequalities
- Note on solving linear complementarity problems as jointly constrained bilinear programs
- A simple characterization of solution sets of convex programs
- Minimum principle sufficiency
- Foundations of optimization
- Solution of symmetric linear complementarity problems by iterative methods
- On the convergence of projected gradient processes to singular critical points
- Two-Metric Projection Methods for Constrained Optimization
- Jointly Constrained Biconvex Programming
- Projected gradient methods for linearly constrained problems
- On the Identification of Active Constraints
- Convergence of SQP-Like Methods for Constrained Optimization
- Newton’s Method and the Goldstein Step-Length Rule for Constrained Minimization Problems
- Convergence Rates for Conditional Gradient Sequences Generated by Implicit Step Length Rules
- Global and Asymptotic Convergence Rate Estimates for a Class of Projected Gradient Processes
- Generalized equations and their solutions, part II: Applications to nonlinear programming
- Iterative methods for variational and complementarity problems
- On the Goldstein-Levitin-Polyak gradient projection method
- Rates of Convergence for Conditional Gradient Algorithms Near Singular and Nonsingular Extremals
- An iterative scheme for variational inequalities
- Projected Newton Methods for Optimization Problems with Simple Constraints
- Convex programming in Hilbert space
- Convex Analysis