A globally convergent Newton method for convex \(SC^1\) minimization problems
From MaRDI portal
Publication: 1896575
DOI: 10.1007/BF02193060
zbMath: 0831.90095
OpenAlex: W2082746403
MaRDI QID: Q1896575
Publication date: 4 September 1995
Published in: Journal of Optimization Theory and Applications
Full work available at URL: https://doi.org/10.1007/bf02193060
Keywords: Newton method; convex minimization; nonlinear minimax problems; stochastic programs with recourse; globally convergent and locally superlinearly convergent method; semismooth but nondifferentiable gradient
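The keywords describe the general setting: minimizing a convex function whose gradient is semismooth but not differentiable, via a Newton-type iteration that is globally convergent (line search) and locally superlinearly convergent (generalized Hessian steps). As a hedged illustration only — the objective below is an invented toy example, not taken from the paper — such an iteration can be sketched as:

```python
import numpy as np

# Illustrative sketch of a damped semismooth Newton method on a simple
# convex SC^1 function (this f is a hypothetical example, not the paper's):
#   f(x) = 0.5*||max(x, 0)||^2 + 0.5*||x - a||^2
# Its gradient g(x) = max(x, 0) + x - a is piecewise linear (semismooth),
# and diag(1{x_i > 0}) + I is an element of the generalized Hessian of f.

def semismooth_newton(a, x0, tol=1e-10, max_iter=50):
    x = np.asarray(x0, dtype=float)
    a = np.asarray(a, dtype=float)
    f = lambda z: 0.5 * np.sum(np.maximum(z, 0.0) ** 2) + 0.5 * np.sum((z - a) ** 2)
    for _ in range(max_iter):
        g = np.maximum(x, 0.0) + x - a            # semismooth gradient
        if np.linalg.norm(g) < tol:
            break
        V = np.diag((x > 0).astype(float) + 1.0)  # element of the generalized Hessian
        d = np.linalg.solve(V, -g)                # generalized Newton direction
        t = 1.0                                    # Armijo backtracking: global convergence
        while f(x + t * d) > f(x) + 1e-4 * t * g.dot(d):
            t *= 0.5
        x = x + t * d
    return x

# For a = [2, -1] the unique minimizer is [1, -1]; the iteration reaches it
# in a few steps from a remote starting point.
print(semismooth_newton(a=[2.0, -1.0], x0=[5.0, 5.0]))
```

Near the solution the full step (t = 1) is accepted and the iteration inherits the local superlinear rate of the generalized Newton step, while the backtracking safeguards iterates far from the minimizer.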
Related Items
On solutions and duality of nonlinear nonsmooth fractional programs, Smooth and Semismooth Newton Methods for Constrained Approximation and Estimation, A nonsmooth Newton method for elastoplastic problems, Constructing a sequence of discrete Hessian matrices of an \(SC^{1}\) function uniformly convergent to the generalized Hessian matrix, An SQP algorithm for extended linear-quadratic problems in stochastic programming, A preconditioning proximal Newton method for nondifferentiable convex optimization, Solution of monotone complementarity problems with locally Lipschitzian functions, A Newton's method for perturbed second-order cone programs, Newton's method for quadratic stochastic programs with recourse, Inexact Newton methods for solving nonsmooth equations, Minimization of \(SC^1\) functions and the Maratos effect, The semismooth-related properties of a merit function and a descent method for the nonlinear complementarity problem, Composite Difference-Max Programs for Modern Statistical Estimation Problems, Convergence analysis of Gauss-Newton methods for the complementarity problem, Local feasible QP-free algorithms for the constrained minimization of SC\(^1\) functions, On Efficiently Solving the Subproblems of a Level-Set Method for Fused Lasso Problems, A parallel inexact Newton method for stochastic programs with recourse, Globally convergent coderivative-based generalized Newton methods in nonsmooth optimization, MultiComposite Nonconvex Optimization for Training Deep Neural Networks, A trust region algorithm with adaptive cubic regularization methods for nonsmooth convex minimization, A perturbed version of an inexact generalized Newton method for solving nonsmooth equations, Globally convergent inexact generalized Newton method for first-order differentiable optimization problems, Multistage quadratic stochastic programming, A globally and superlinearly convergent trust region method for \(LC^1\) optimization problems, Differentiability and semismoothness properties of integral functions and their applications, Convergence rate of Newton's method for \(L_2\) spectral estimation, An inexact SQP Newton method for convex SC\(^{1}\) minimization problems, A new trust region algorithm for nonsmooth convex minimization, Globally convergent inexact generalized Newton's methods for nonsmooth equations, An effective adaptive trust region algorithm for nonsmooth minimization, The \(SC^1\) property of an expected residual function arising from stochastic complementarity problems, Shape-preserving interpolation and smoothing for options market implied volatility, Computing the Best Approximation over the Intersection of a Polyhedral Set and the Doubly Nonnegative Cone, Globally and superlinearly convergent trust-region algorithm for convex \(SC^1\)-minimization problems and its application to stochastic programs, An SQP method for general nonlinear complementarity problems, Newton-type methods for stochastic programming.
Cites Work
- NE/SQP: A robust algorithm for the nonlinear complementarity problem
- Generalized Hessian matrix and second-order optimality conditions for problems with \(C^{1,1}\) data
- Epigraphical analysis
- Lipschitzian inverse functions, directional derivatives, and applications in \(C^{1,1}\) optimization
- Superlinearly convergent approximate Newton methods for LC\(^ 1\) optimization problems
- A family of variable metric proximal methods
- On concepts of directional differentiability
- Newton's method for quadratic stochastic programs with recourse
- A nonsmooth version of Newton's method
- Nonsmooth Equations: Motivation and Algorithms
- Piecewise \(C^k\) functions in nonsmooth analysis
- Newton's Method for B-Differentiable Equations
- Optimization and nonsmooth analysis
- A Lagrangian finite generation technique for solving linear-quadratic problems in stochastic programming
- Extension of Newton and quasi-Newton methods to systems of PC\(^1\) equations
- Local structure of feasible sets in nonlinear programming, Part III: Stability and sensitivity
- On second-order sufficient optimality conditions for \(C^{1,1}\)-optimization problems
- Minimization of Locally Lipschitzian Functions
- Entropic Proximal Mappings with Applications to Nonlinear Programming
- Superlinearly convergent variable metric algorithms for general nonlinear programming problems
- Semismooth and Semiconvex Functions in Constrained Optimization
- Convergence Analysis of Some Algorithms for Solving Nonsmooth Equations
- Piecewise Smoothness, Local Invertibility, and Parametric Analysis of Normal Maps