Generalized Hessian matrix and second-order optimality conditions for problems with \(C^{1,1}\) data



DOI: 10.1007/BF01442169 · zbMath: 0542.49011 · MaRDI QID: Q795323

Jean-Baptiste Hiriart-Urruty, Jean-Jacques Strodiot, Van Hien Nguyen

Publication date: 1984

Published in: Applied Mathematics and Optimization


Mathematics Subject Classification

90C30: Nonlinear programming

49M37: Numerical methods based on nonlinear programming

58C20: Differentiation theory (Gâteaux, Fréchet, etc.) on manifolds

26B05: Continuity and differentiation questions

49K10: Optimality conditions for free problems in two or more independent variables
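
For orientation, a brief sketch (added here, not quoted from the record or the paper) of the central object under the standard \(C^{1,1}\) conventions: for a function \(f\) whose gradient is locally Lipschitz, the generalized Hessian usually attributed to this setting is the Clarke generalized Jacobian of \(\nabla f\), and the unconstrained second-order necessary condition then reads, assuming these conventions:

\[
\partial^2 f(\bar x) \;=\; \operatorname{co}\left\{\, \lim_{k\to\infty} \nabla^2 f(x_k) \;:\; x_k \to \bar x,\ f \text{ twice differentiable at } x_k \,\right\},
\]
\[
\bar x \ \text{local minimizer} \;\Longrightarrow\; \nabla f(\bar x) = 0 \ \text{ and } \ \forall d \ \exists\, M \in \partial^2 f(\bar x) \ \text{ with } \ \langle M d, d\rangle \ge 0 .
\]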


Related Items

Second-order conditions in \(C^{1,1}\) optimization with applications
Generalized second-order derivatives and optimality conditions
Second order approximations and dual necessary optimality conditions
A finite Newton method for classification
Minimal approximate Hessians for continuously Gâteaux differentiable functions
Local properties of solutions of nonsmooth variational inequalities
Invexity criteria for a class of vector-valued functions
On second-order directional derivatives
Generalised Hessian, max function and weak convexity
Second-order necessary optimality conditions via directional regularity
Limiting subhessians, limiting subjets and their calculus
A globally and superlinearly convergent trust region method for \(LC^1\) optimization problems
Optimality conditions for \(C^{1,1}\) vector optimization problems
Saddlepoint Problems in Nondifferentiable Programming
Multicategory proximal support vector machine classifiers
Local feasible QP-free algorithms for the constrained minimization of \(SC^1\) functions
Second-order optimality conditions for nondominated solutions of multiobjective programming with \(C^{1,1}\) data
A Newton method for linear programming
Second-order conditions for efficiency in nonsmooth multiobjective optimization problems
Generalized derivatives and nonsmooth optimization, a finite dimensional tour (with comments and rejoinder)
A set-valued analysis approach to second order differentiation of nonsmooth functions
An ODE-based trust region method for unconstrained optimization problems
On second-order Fritz John type optimality conditions in nonsmooth multiobjective programming
First and second order optimality conditions using approximations for nonsmooth vector optimization in Banach spaces
Parametric method for global optimization
Some applications of variational inequalities in nonsmooth analysis
Second-order necessary optimality conditions for optimization problems involving set-valued maps
A new second order optimality conditions for the extremal problem under inclusion constraints
Differentiability properties of functions that are \(\ell\)-stable at a point
From scalar to vector optimization
Parametric proximal-point methods
First and second-order approximations as derivatives of mappings in optimality conditions for nonsmooth vector optimization
Fréchet approach in second-order optimization
A note on second-order optimality conditions
Characterization of strict convexity for locally Lipschitz functions
The second order optimality conditions for nonlinear mathematical programming with \(C^{1,1}\) data
Second-order global optimality conditions for convex composite optimization
Newton's method and quasi-Newton-SQP method for general \(\text{LC}^1\) constrained optimization
Lipschitzian inverse functions, directional derivatives, and applications in \(C^{1,1}\) optimization
Descent algorithm for a class of convex nondifferentiable functions
Limiting behavior of the approximate second-order subdifferential of a convex function
Characterizations of strict local minima and necessary conditions for weak sharp minima
Metric regularity and second-order necessary optimality conditions for minimization problems under inclusion constraints
Superlinearly convergent approximate Newton methods for \(LC^1\) optimization problems
Generalized Hessian for \(C^{1,1}\) functions in infinite dimensional normed spaces
Second order optimality conditions for the extremal problem under inclusion constraints
Optimality conditions for \(C^{1,1}\) constrained multiobjective problems
Support functions of the Clarke generalized Jacobian and of its plenary hull
Second-order global optimality conditions for optimization problems
Second-order optimality conditions for the extremal problem under inclusion constraints
A globally convergent Newton method for convex \(SC^1\) minimization problems
Minimization of \(SC^1\) functions and the Maratos effect
Convex composite minimization with \(C^{1,1}\) functions
Second-order subdifferentials of \(C^{1,1}\) functions and optimality conditions
Massive data classification via unconstrained support vector machines
Breast tumor susceptibility to chemotherapy via support vector machines
On second-order conditions in unconstrained optimization
Stability in generalized differentiability based on a set convergence principle
Second-order mollified derivatives and optimization
Second-order conditions in \(C^{1,1}\) constrained vector optimization
Pseudo-Hessian and Taylor's expansion for vector-valued functions
Exact penalty functions and Lagrange multipliers
Generalized Second Derivatives of Convex Functions and Saddle Functions
Stability of inclusions: characterizations via suitable Lipschitz functions and algorithms
Recursive Finite Newton Algorithm for Support Vector Regression in the Primal
Chunking for massive nonlinear kernel classification
A filter-trust-region method for \(LC^1\) unconstrained optimization and its global convergence
On second-order sufficient optimality conditions for \(C^{1,1}\)-optimization problems
On Pseudo-Differentiability
On relations and applications of generalized second-order directional derivatives
Approximate generalized Hessians and Taylor's expansions for continuously Gâteaux differentiable functions



Cites Work