CUTE
Software: 26578
swMATH: 14681
MaRDI QID: Q26578
FDO: Q26578
Author name not available
Cited In (only showing first 100 items)
- Numerical experience with a class of self-scaling quasi-Newton algorithms
- A hybrid conjugate gradient method with descent property for unconstrained optimization
- A subspace limited memory quasi-Newton algorithm for large-scale nonlinear bound constrained optimization
- CUTEst: a constrained and unconstrained testing environment with safe threads for mathematical optimization
- A globally convergent penalty-free method for optimization with equality constraints and simple bounds
- Spectral method and its application to the conjugate gradient method
- Structured minimal-memory inexact quasi-Newton method and secant preconditioners for augmented Lagrangian optimization
- A family of second-order methods for convex \(\ell _1\)-regularized optimization
- Nonmonotone adaptive trust region method with line search based on new diagonal updating
- An efficient modification of the Hestenes-Stiefel nonlinear conjugate gradient method with restart property
- Efficient tridiagonal preconditioner for the matrix-free truncated Newton method
- Quasi-Newton methods based on ordinary differential equation approach for unconstrained nonlinear optimization
- A fast convergent sequential linear equation method for inequality constrained optimization without strict complementarity
- On Hager and Zhang's conjugate gradient method with guaranteed descent
- The convergence of conjugate gradient method with nonmonotone line search
- Benchmarking nonlinear optimization software in technical computing environments
- Global and local convergence of a class of penalty-free-type methods for nonlinear programming
- Another improved Wei-Yao-Liu nonlinear conjugate gradient method with sufficient descent property
- Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization
- An inexact Newton method for nonconvex equality constrained optimization
- An improved strongly sub-feasible SSLE method for optimization problems and numerical experiments
- New hybrid conjugate gradient method for unconstrained optimization
- A limited memory BFGS algorithm for non-convex minimization with applications in matrix largest eigenvalue problem
- Spectral scaling BFGS method
- A new sequential systems of linear equations algorithm of feasible descent for inequality constrained optimization
- Improved Hessian approximation with modified secant equations for symmetric rank-one method
- A spectral conjugate gradient method for solving large-scale unconstrained optimization
- Scaled memoryless symmetric rank one method for large-scale optimization
- Globally convergent modified Perry's conjugate gradient method
- A dwindling filter line search method for unconstrained optimization
- Algorithm 851
- Global convergence of some modified PRP nonlinear conjugate gradient methods
- New version of the three-term conjugate gradient method based on spectral scaling conjugacy condition that generates descent search direction
- A conic trust-region method and its convergence properties
- An improved Wei-Yao-Liu nonlinear conjugate gradient method for optimization computation
- A primal-dual augmented Lagrangian
- Two descent hybrid conjugate gradient methods for optimization
- A self-adaptive three-term conjugate gradient method for monotone nonlinear equations with convex constraints
- Scaled conjugate gradient algorithms for unconstrained optimization
- An acceleration of gradient descent algorithm with backtracking for unconstrained optimization
- Augmented Lagrangian applied to convex quadratic problems
- A conjugate gradient method for unconstrained optimization problems
- Accelerated hybrid conjugate gradient algorithm with modified secant condition for unconstrained optimization
- Improving solver success in reaching feasibility for sets of nonlinear constraints
- Hybrid conjugate gradient algorithm for unconstrained optimization
- Convergence of nonmonotone line search method
- Mathematical programming models and algorithms for engineering design optimization
- A numerical study of limited memory BFGS methods
- A nonmonotone filter method for nonlinear optimization
- Primal and dual active-set methods for convex quadratic programming
- An active set truncated Newton method for large-scale bound constrained optimization
- A scaled BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- Augmented Lagrangian methods under the constant positive linear dependence constraint qualification
- Another hybrid conjugate gradient algorithm for unconstrained optimization
- A repository of convex quadratic programming problems
- A new family of penalties for augmented Lagrangian methods
- Algorithm 943
- A Dai-Yuan conjugate gradient algorithm with sufficient descent and conjugacy conditions for unconstrained optimization
- Global convergence of a modified Fletcher-Reeves conjugate gradient method with Armijo-type line search
- On the performance of a new symmetric rank-one method with restart for solving unconstrained optimization problems
- Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- Inertia-controlling factorizations for optimization algorithms
- An interior algorithm for nonlinear optimization that combines line search and trust region steps
- A new method of moving asymptotes for large-scale unconstrained optimization
- Matching-based preprocessing algorithms to the solution of saddle-point problems in large-scale nonconvex interior-point optimization
- Recent progress in unconstrained nonlinear optimization without derivatives
- An improved spectral conjugate gradient algorithm for nonconvex unconstrained optimization problems
- A restoration-free filter SQP algorithm for equality constrained optimization
- A symmetric rank-one method based on extra updating techniques for unconstrained optimization
- A family of three-term conjugate gradient methods with sufficient descent property for unconstrained optimization
- Scaling on diagonal quasi-Newton update for large-scale unconstrained optimization
- An implementation of Shor's \(r\)-algorithm
- Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization
- Nonmonotone strategy for minimization of quadratics with simple constraints
- A practical relative error criterion for augmented Lagrangians
- Global convergence of a modified Hestenes-Stiefel nonlinear conjugate gradient method with Armijo line search
- Numerical experiments with the Lancelot package (Release \(A\)) for large-scale nonlinear optimization
- Sufficient descent nonlinear conjugate gradient methods with conjugacy condition
- Monotone projected gradient methods for large-scale box-constrained quadratic programming
- New accelerated conjugate gradient algorithms as a modification of Dai-Yuan's computational scheme for unconstrained optimization
- Mixed integer nonlinear programming tools: an updated practical overview
- Acceleration of conjugate gradient algorithms for unconstrained optimization
- Nonlinear conjugate gradient methods with sufficient descent condition for large-scale unconstrained optimization
- An adaptive trust region method based on simple conic models
- A note on the implementation of an interior-point algorithm for nonlinear optimization with inexact step computations
- On per-iteration complexity of high order Chebyshev methods for sparse functions with banded Hessians
- Mixed integer nonlinear programming tools: a practical overview
- Modified nonmonotone Armijo line search for descent method
- Simple sequential quadratically constrained quadratic programming feasible algorithm with active identification sets for constrained minimax problems
- Global convergence of quasi-Newton methods based on adjoint Broyden updates
- A globally and superlinearly convergent primal-dual interior point trust region method for large scale constrained optimization
- A modified nonlinear conjugate gradient method with the Armijo line search and its application
- A new general form of conjugate gradient methods with guaranteed descent and strong global convergence properties
- The global proof of the Polak-Ribière-Polyak algorithm under the YWL inexact line search technique
- Simulated annealing with asymptotic convergence for nonlinear constrained optimization
- A penalty-interior-point algorithm for nonlinear constrained optimization
- Two modified Dai-Yuan nonlinear conjugate gradient methods
- Descentwise inexact proximal algorithms for smooth optimization
- On a two-phase approximate greatest descent method for nonlinear optimization with equality constraints
- Some three-term conjugate gradient methods with the inexact line search condition