CUTE
From MaRDI portal
Cited in
(only showing first 100 items)
- Nonmonotone curvilinear line search methods for unconstrained optimization
- A new \(\varepsilon \)-generalized projection method of strongly sub-feasible directions for inequality constrained optimization
- Multi-step spectral gradient methods with modified weak secant relation for large scale unconstrained optimization
- Simulated annealing with asymptotic convergence for nonlinear constrained optimization
- A truncated descent HS conjugate gradient method and its global convergence
- Nonconvex optimization using negative curvature within a modified linesearch
- A three term Polak-Ribière-Polyak conjugate gradient method close to the memoryless BFGS quasi-Newton method
- BOX-QUACAN
- Globally convergence of nonlinear conjugate gradient method for unconstrained optimization
- Preconditioned conjugate gradient algorithms for nonconvex problems with box constraints
- A conjugate directions approach to improve the limited-memory BFGS method
- The global proof of the Polak-Ribière-Polak algorithm under the YWL inexact line search technique
- A new Liu-Storey type nonlinear conjugate gradient method for unconstrained optimization problems
- A primal-dual interior-point method capable of rapidly detecting infeasibility for nonlinear programs
- A nonmonotone scaled conjugate gradient algorithm for large-scale unconstrained optimization
- An accelerated three-term conjugate gradient method with sufficient descent condition and conjugacy condition
- A class of collinear scaling algorithms for bound-constrained optimization: Derivation and computational results
- A simple sufficient descent method for unconstrained optimization
- Adaptive, limited-memory BFGS algorithms for unconstrained optimization
- A globally and quadratically convergent algorithm with efficient implementation for unconstrained optimization
- A conjugate gradient method with sufficient descent property
- A note on the implementation of an interior-point algorithm for nonlinear optimization with inexact step computations
- Computational experience with penalty-barrier methods for nonlinear programming
- A superlinearly convergent SQP method without boundedness assumptions on any of the iterative sequences
- Advances in design and implementation of optimization software
- Properties of the block BFGS update and its application to the limited-memory block BNS method for unconstrained minimization
- Nonmonotone spectral gradient method based on memoryless symmetric rank-one update for large-scale unconstrained optimization
- Some three-term conjugate gradient methods with the inexact line search condition
- A starting point strategy for nonlinear interior methods.
- Augmented Lagrangian algorithms based on the spectral projected gradient method for solving nonlinear programming problems
- A modified nonlinear conjugate gradient method with the Armijo line search and its application
- Global convergence of a nonmonotone trust region algorithm with memory for unconstrained optimization
- The TOMLAB optimization environment in MATLAB
- A feasible QP-free algorithm combining the interior-point method with active set for constrained optimization
- A double parameter self-scaling memoryless BFGS method for unconstrained optimization
- Some three-term conjugate gradient methods with the new direction structure
- A filter algorithm with inexact line search
- Some sufficient descent conjugate gradient methods and their global convergence
- A globally convergent primal-dual interior-point relaxation method for nonlinear programs
- Partial spectral projected gradient method with active-set strategy for linearly constrained optimization
- A sufficient descent LS conjugate gradient method for unconstrained optimization problems
- A working set SQCQP algorithm with simple nonmonotone penalty parameters
- A modified quasi-Newton method for structured optimization with partial information on the Hessian
- A new spectral PRP conjugate gradient method with sufficient descent property
- Derivative-free nonlinear optimization filter simplex
- A new accelerated diagonal quasi-Newton updating method with scaled forward finite differences directional derivative for unconstrained optimization
- Evaluating bound-constrained minimization software
- A modified spectral conjugate gradient method with global convergence
- Spectral method and its application to the conjugate gradient method
- An efficient modification of the Hestenes-Stiefel nonlinear conjugate gradient method with restart property
- Global and local convergence of a class of penalty-free-type methods for nonlinear programming
- Efficient tridiagonal preconditioner for the matrix-free truncated Newton method
- Quasi-Newton methods based on ordinary differential equation approach for unconstrained nonlinear optimization
- A fast convergent sequential linear equation method for inequality constrained optimization without strict complementarity
- New version of the three-term conjugate gradient method based on spectral scaling conjugacy condition that generates descent search direction
- On Hager and Zhang's conjugate gradient method with guaranteed descent
- A subspace limited memory quasi-Newton algorithm for large-scale nonlinear bound constrained optimization
- An inexact Newton method for nonconvex equality constrained optimization
- Scaled conjugate gradient algorithms for unconstrained optimization
- An improved strongly sub-feasible SSLE method for optimization problems and numerical experiments
- Numerical experiments with the Lancelot package (Release \(A\)) for large-scale nonlinear optimization
- A conic trust-region method and its convergence properties
- Structured minimal-memory inexact quasi-Newton method and secant preconditioners for augmented Lagrangian optimization
- Algorithm 851
- New hybrid conjugate gradient method for unconstrained optimization
- A limited memory BFGS algorithm for non-convex minimization with applications in matrix largest eigenvalue problem
- A family of second-order methods for convex \(\ell _1\)-regularized optimization
- An improved Wei-Yao-Liu nonlinear conjugate gradient method for optimization computation
- Hybrid conjugate gradient algorithm for unconstrained optimization
- Another improved Wei-Yao-Liu nonlinear conjugate gradient method with sufficient descent property
- An acceleration of gradient descent algorithm with backtracking for unconstrained optimization
- Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization
- Augmented Lagrangian methods under the constant positive linear dependence constraint qualification
- Scaled memoryless symmetric rank one method for large-scale optimization
- The convergence of conjugate gradient method with nonmonotone line search
- A numerical study of limited memory BFGS methods
- Globally convergent modified Perry's conjugate gradient method
- Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization
- Another hybrid conjugate gradient algorithm for unconstrained optimization
- Acceleration of conjugate gradient algorithms for unconstrained optimization
- LANCELOT
- SPG
- L-BFGS-B
- GALAHAD
- L-BFGS
- CUTEr
- NLPQLP
- Algorithm 739
- DAFNE
- CONMIN
- Spectral scaling BFGS method
- CG_DESCENT
- ipfilter
- MINPACK-2
- NNLS
- levmar
- MINOPT
- TRON
- UFO
- FFSQP(f77)
This page was built for software: CUTE