Globally and Superlinearly Convergent Algorithm for Minimizing a Normal Merit Function
DOI: 10.1137/S0363012996310245 · zbMATH Open: 0916.65051 · OpenAlex: W2061555414 · MaRDI QID: Q4388945
Publication date: 10 May 1998
Published in: SIAM Journal on Control and Optimization
Full work available at URL: https://doi.org/10.1137/s0363012996310245
Recommendations
- A note on a globally convergent Newton method for solving monotone variational inequalities
- Global and superlinear convergence of the smoothing Newton method and its application to general box constrained variational inequalities
- Globally Convergent Newton Methods for Nonsmooth Equations
- Merit functions for nonsmooth complementarity problems and related descent algorithms
Keywords: global convergence; variational inequalities; superlinear convergence; generalized Newton method; first-order algorithm; systems of nonsmooth equations; normal merit function
MSC classification: Numerical optimization and variational techniques (65K10); Numerical computation of solutions to systems of equations (65H10); Variational inequalities (49J40); Newton-type methods (49M15)
Cited In (4)
- A sequential quadratically constrained quadratic programming method for unconstrained minimax problems
- Quadratically constraint quadratical algorithm model for nonlinear minimax problems
- A merit function approach to the subgradient method with averaging
- A quadratically approximate framework for constrained optimization, global and local convergence