CUTE
From MaRDI portal
Software:26578
No author found.
Related Items (only showing first 100 items)
A globally convergent penalty-free method for optimization with equality constraints and simple bounds ⋮ An efficient modification of the Hestenes-Stiefel nonlinear conjugate gradient method with restart property ⋮ Efficient tridiagonal preconditioner for the matrix-free truncated Newton method ⋮ Quasi-Newton methods based on ordinary differential equation approach for unconstrained nonlinear optimization ⋮ A fast convergent sequential linear equation method for inequality constrained optimization without strict complementarity ⋮ On Hager and Zhang's conjugate gradient method with guaranteed descent ⋮ New hybrid conjugate gradient method for unconstrained optimization ⋮ Spectral method and its application to the conjugate gradient method ⋮ A limited memory BFGS algorithm for non-convex minimization with applications in matrix largest eigenvalue problem ⋮ A descent hybrid conjugate gradient method based on the memoryless BFGS update ⋮ An accelerated three-term conjugate gradient method with sufficient descent condition and conjugacy condition ⋮ Corrected sequential linear programming for sparse minimax optimization ⋮ A self-adaptive three-term conjugate gradient method for monotone nonlinear equations with convex constraints ⋮ A family of three-term nonlinear conjugate gradient methods close to the memoryless BFGS method ⋮ A family of second-order methods for convex \(\ell _1\)-regularized optimization ⋮ Primal and dual active-set methods for convex quadratic programming ⋮ An active set truncated Newton method for large-scale bound constrained optimization ⋮ An inexact Newton method for nonconvex equality constrained optimization ⋮ Partial spectral projected gradient method with active-set strategy for linearly constrained optimization ⋮ Sufficient descent nonlinear conjugate gradient methods with conjugacy condition ⋮ Best practices for comparing optimization algorithms ⋮ A modified quasi-Newton method for structured optimization with partial information on the Hessian ⋮ Recent progress in unconstrained nonlinear optimization without derivatives ⋮ An improved Perry conjugate gradient method with adaptive parameter choice ⋮ Monotone projected gradient methods for large-scale box-constrained quadratic programming ⋮ On the performance of a new symmetric rank-one method with restart for solving unconstrained optimization problems ⋮ A class of one parameter conjugate gradient methods ⋮ A new conjugate gradient algorithm with sufficient descent property for unconstrained optimization ⋮ An improved spectral conjugate gradient algorithm for nonconvex unconstrained optimization problems ⋮ Modification of nonlinear conjugate gradient method with weak Wolfe-Powell line search ⋮ A restoration-free filter SQP algorithm for equality constrained optimization ⋮ Spectral scaling BFGS method ⋮ A scaled BFGS preconditioned conjugate gradient algorithm for unconstrained optimization ⋮ A practical relative error criterion for augmented Lagrangians ⋮ The convergence of conjugate gradient method with nonmonotone line search ⋮ A simple sufficient descent method for unconstrained optimization ⋮ On per-iteration complexity of high order Chebyshev methods for sparse functions with banded Hessians ⋮ An adaptive trust region method based on simple conic models ⋮ Scaling on diagonal quasi-Newton update for large-scale unconstrained optimization ⋮ A new general form of conjugate gradient methods with guaranteed descent and strong global convergence properties ⋮
A new sequential systems of linear equations algorithm of feasible descent for inequality constrained optimization ⋮ Improved Hessian approximation with modified secant equations for symmetric rank-one method ⋮ Another improved Wei-Yao-Liu nonlinear conjugate gradient method with sufficient descent property ⋮ A primal-dual augmented Lagrangian ⋮ Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization ⋮ Globally convergent modified Perry's conjugate gradient method ⋮ The global convergence of a descent PRP conjugate gradient method ⋮ Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization ⋮ Global convergence of a spectral conjugate gradient method for unconstrained optimization ⋮ Global convergence of some modified PRP nonlinear conjugate gradient methods ⋮ On a two-phase approximate greatest descent method for nonlinear optimization with equality constraints ⋮ A new \(\varepsilon \)-generalized projection method of strongly sub-feasible directions for inequality constrained optimization ⋮ Interior-point methods for nonconvex nonlinear programming: cubic regularization ⋮ A symmetric rank-one method based on extra updating techniques for unconstrained optimization ⋮ A sufficient descent LS conjugate gradient method for unconstrained optimization problems ⋮ A working set SQCQP algorithm with simple nonmonotone penalty parameters ⋮ Augmented Lagrangian applied to convex quadratic problems ⋮ Global and local convergence of a nonmonotone SQP method for constrained nonlinear optimization ⋮ Global convergence of a nonmonotone trust region algorithm with memory for unconstrained optimization ⋮ Global convergence of a modified Hestenes-Stiefel nonlinear conjugate gradient method with Armijo line search ⋮ New version of the three-term conjugate gradient method based on spectral scaling conjugacy condition that generates descent search direction ⋮ Properties of the block BFGS update and its application to the limited-memory block BNS method for unconstrained minimization ⋮ Simulated annealing with asymptotic convergence for nonlinear constrained optimization ⋮ A limited memory descent Perry conjugate gradient method ⋮ Dai-Kou type conjugate gradient methods with a line search only using gradient ⋮ A modified three-term PRP conjugate gradient algorithm for optimization models ⋮ An improved strongly sub-feasible SSLE method for optimization problems and numerical experiments ⋮ A conjugate gradient method for unconstrained optimization problems ⋮ Global and local convergence of a class of penalty-free-type methods for nonlinear programming ⋮ A nonmonotone filter method for nonlinear optimization ⋮ Modified nonmonotone Armijo line search for descent method ⋮ A modified CG-DESCENT method for unconstrained optimization ⋮ A robust implementation of a sequential quadratic programming algorithm with successive error restoration ⋮ Accelerated hybrid conjugate gradient algorithm with modified secant condition for unconstrained optimization ⋮ A new globalization technique for nonlinear conjugate gradient methods for nonconvex minimization ⋮ A feasible QP-free algorithm combining the interior-point method with active set for constrained optimization ⋮ Preconditioned conjugate gradient algorithms for nonconvex problems with box constraints ⋮ New accelerated conjugate gradient algorithms as a modification of Dai-Yuan's computational scheme for unconstrained optimization ⋮
Scaled memoryless symmetric rank one method for large-scale optimization ⋮ Some three-term conjugate gradient methods with the inexact line search condition ⋮ A trajectory-based method for constrained nonlinear optimization problems ⋮ A new Liu-Storey type nonlinear conjugate gradient method for unconstrained optimization problems ⋮ A superlinearly convergent strongly sub-feasible SSLE-type algorithm with working set for nonlinearly constrained optimization ⋮ Global convergence of quasi-Newton methods based on adjoint Broyden updates ⋮ Two modified Dai-Yuan nonlinear conjugate gradient methods ⋮ A globally and quadratically convergent algorithm with efficient implementation for unconstrained optimization ⋮ A conjugate gradient method with sufficient descent property ⋮ Numerical experience with a class of self-scaling quasi-Newton algorithms ⋮ A class of collinear scaling algorithms for bound-constrained optimization: Derivation and computational results ⋮ Hybrid conjugate gradient algorithm for unconstrained optimization ⋮ Acceleration of conjugate gradient algorithms for unconstrained optimization ⋮ A conic trust-region method and its convergence properties ⋮ An implementation of Shor's \(r\)-algorithm ⋮ Mixed integer nonlinear programming tools: a practical overview ⋮ Nonlinear conjugate gradient methods with sufficient descent condition for large-scale unconstrained optimization ⋮ A truncated descent HS conjugate gradient method and its global convergence ⋮ An improved Wei-Yao-Liu nonlinear conjugate gradient method for optimization computation ⋮ A framework for globally convergent algorithms using gradient bounding functions ⋮ A primal-dual interior-point method capable of rapidly detecting infeasibility for nonlinear programs ⋮ Advances in design and implementation of optimization software