Perturbed Kuhn-Tucker points and rates of convergence for a class of nonlinear-programming algorithms
From MaRDI portal
Publication: 4047415
DOI: 10.1007/BF01585500
zbMath: 0294.90078
MaRDI QID: Q4047415
Publication date: 1974
Published in: Mathematical Programming
Numerical mathematical programming methods (65K05) ⋮ Convex programming (90C25) ⋮ Sensitivity, stability, well-posedness (49K40) ⋮ Nonlinear programming (90C30)
Related Items (75)
Lipschitz properties of solutions in mathematical programming ⋮ Superlinearly convergent approximate Newton methods for LC\(^ 1\) optimization problems ⋮ Second order sensitivity analysis and asymptotic theory of parametrized nonlinear programs ⋮ Newton's method for a class of nonsmooth functions ⋮ Sensitivity analysis in economics ⋮ On combining feasibility, descent and superlinear convergence in inequality constrained optimization ⋮ Fixed-time control under spatiotemporal and input constraints: a quadratic programming based approach ⋮ Sequential quadratic programming algorithm for discrete optimal control problems with control inequality constraints ⋮ A projected Newton method in a Cartesian product of balls ⋮ The Lagrange-Newton method for state constrained optimal control problems ⋮ A competitive inexact nonmonotone filter SQP method: convergence analysis and numerical results ⋮ Quadratically and superlinearly convergent algorithms for the solution of inequality constrained minimization problems ⋮ Sensitivity and stability analysis for nonlinear programming ⋮ A sequential quadratic programming method for potentially infeasible mathematical programs ⋮ Convergence of algorithms for perturbed optimization problems ⋮ Computable bounds on parametric solutions of convex problems ⋮ The Lagrange-Newton method for infinite-dimensional optimization problems ⋮ A Projected Gradient and Constraint Linearization Method for Nonlinear Model Predictive Control ⋮ A globally convergent version of a general recursive algorithm for nonlinear programming ⋮ Parametric sensitivity analysis of perturbed PDE optimal control problems with state and control constraints ⋮ An SQP method for optimal control of weakly singular Hammerstein integral equations ⋮ Max-min resource allocation ⋮ Exponential Decay of Sensitivity in Graph-Structured Nonlinear Programs ⋮ A parallel inexact Newton method for stochastic programs with recourse ⋮ Newton's method and its use in optimization ⋮ Substitution
secant/finite difference method to large sparse minimax problems ⋮ A superlinearly convergent numerical algorithm for nonlinear programming ⋮ A robust secant method for optimization problems with inequality constraints ⋮ A special Newton-type optimization method ⋮ Inexact Newton-Type Optimization with Iterated Sensitivities ⋮ Enlarging the region of convergence of Newton's method for constrained optimization ⋮ Superlinear convergence of the sequential quadratic method in constrained optimization ⋮ Stochastic programming with incomplete information: a survey of results on postoptimization and sensitivity analysis ⋮ A Sparsity Preserving Convexification Procedure for Indefinite Quadratic Programs Arising in Direct Optimal Control ⋮ Direct Optimal Control and Model Predictive Control ⋮ A new SQP method of feasible directions for nonlinear programming. ⋮ On the global and superlinear convergence of a discretized version of Wilson's method ⋮ A recursive quadratic programming algorithm for semi-infinite optimization problems ⋮ Sequential quadratic programming methods for parametric nonlinear optimization ⋮ Estimates for Kuhn-Tucker points of perturbed convex programs ⋮ A norm-relaxed method of feasible directions for finely discretized problems from semi-infinite programming ⋮ Solving Stackelberg equilibrium for multi objective aerodynamic shape optimization ⋮ Global and local convergence of a class of penalty-free-type methods for nonlinear programming ⋮ A superlinearly convergent method of feasible directions.
⋮ Newton-type methods: a broader view ⋮ Sensitivity analysis for nonlinear programming using penalty methods ⋮ Inexact Josephy-Newton framework for generalized equations and its applications to local analysis of Newtonian methods for constrained optimization ⋮ Superlinearly convergent quasi-Newton algorithms for nonlinearly constrained optimization problems ⋮ Generalized implicit function theorem and its application to parametric optimal control problems ⋮ Superlinearly convergent variable metric algorithms for general nonlinear programming problems ⋮ Global and local convergence of a filter line search method for nonlinear programming ⋮ A globally convergent method for nonlinear programming ⋮ Stability and sensitivity-analysis for stochastic programming ⋮ Exact penalty function algorithm with simple updating of the penalty parameter ⋮ Superlinearly convergent algorithm for min-max problems ⋮ A second-order method for the general nonlinear programming problem ⋮ The Lagrange-Newton method for nonlinear optimal control problems ⋮ Global convergence of an SQP method without boundedness assumptions on any of the iterative sequences ⋮ Strong Metric (Sub)regularity of Karush–Kuhn–Tucker Mappings for Piecewise Linear-Quadratic Convex-Composite Optimization and the Quadratic Convergence of Newton’s Method ⋮ On the differentiability of optimal values for bounded nonlinear programs with equality constraints ⋮ Inexact Newton methods for the nonlinear complementarity problem ⋮ A robust sequential quadratic programming method ⋮ The linearization method ⋮ Perturbation analysis of nonlinear semidefinite programming under Jacobian uniqueness conditions ⋮ A globally and superlinearly convergent feasible QP-free method for nonlinear programming ⋮ Newton's method for singular constrained optimization problems ⋮ Comments on: Critical Lagrange multipliers: what we currently know about them, how they spoil our lives, and what we can do about it ⋮ Rejoinder on: Critical Lagrange
multipliers: what we currently know about them, how they spoil our lives, and what we can do about it ⋮ Applications of the method of partial inverses to convex programming: Decomposition ⋮ Rates of convergence for adaptive Newton methods ⋮ Degeneracy in NLP and the development of results motivated by its presence ⋮ Some inverse mapping theorems ⋮ An iterative working-set method for large-scale nonconvex quadratic programming ⋮ A globally convergent algorithm for nonlinearly constrained optimization problems ⋮ Primal superlinear convergence of SQP methods in piecewise linear-quadratic composite optimization
Cites Work
- A sufficient condition for continuity of optimal sets in mathematical programming
- On the continuity of the minimum set of a continuous function
- Sensitivity analysis for nonlinear programming using penalty methods
- Computational experience in sensitivity analysis for nonlinear programming
- The Validity of a Family of Optimization Methods
- Recursive Decision Systems: An Existence Analysis
- Iterative Solution of Nonlinear Optimal Control Problems
- Stability in Nonlinear Programming
- Penalty function versus non-penalty function methods for constrained nonlinear programming problems
- A quadratically-convergent algorithm for general nonlinear programming problems