CUTE
From MaRDI portal
Software: 26578
swMATH: 14681 · MaRDI QID: Q26578 · FDO: Q26578
Author name not available
Cited In (only showing first 100 items)
- An adaptive trust region method based on simple conic models
- A note on the implementation of an interior-point algorithm for nonlinear optimization with inexact step computations
- On per-iteration complexity of high order Chebyshev methods for sparse functions with banded Hessians
- Mixed integer nonlinear programming tools: a practical overview
- Modified nonmonotone Armijo line search for descent method
- Simple sequential quadratically constrained quadratic programming feasible algorithm with active identification sets for constrained minimax problems
- Global convergence of quasi-Newton methods based on adjoint Broyden updates
- Title not available
- A globally and superlinearly convergent primal-dual interior point trust region method for large scale constrained optimization
- A modified nonlinear conjugate gradient method with the Armijo line search and its application
- A new general form of conjugate gradient methods with guaranteed descent and strong global convergence properties
- The global proof of the Polak-Ribière-Polak algorithm under the YWL inexact line search technique
- Simulated annealing with asymptotic convergence for nonlinear constrained optimization
- A penalty-interior-point algorithm for nonlinear constrained optimization
- Two modified Dai-Yuan nonlinear conjugate gradient methods
- Descentwise inexact proximal algorithms for smooth optimization
- On a two-phase approximate greatest descent method for nonlinear optimization with equality constraints
- Some three-term conjugate gradient methods with the inexact line search condition
- A conjugate directions approach to improve the limited-memory BFGS method
- A regularized Newton method for degenerate unconstrained optimization problems
- Title not available
- A modified Perry conjugate gradient method and its global convergence
- The global convergence of a descent PRP conjugate gradient method
- Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search
- Accelerated conjugate gradient algorithm with finite difference Hessian/vector product approximation for unconstrained optimization
- Global convergence of a spectral conjugate gradient method for unconstrained optimization
- A new family of conjugate gradient methods
- Interior-point methods for nonconvex nonlinear programming: cubic regularization
- An accelerated three-term conjugate gradient method with sufficient descent condition and conjugacy condition
- A simple sufficient descent method for unconstrained optimization
- A filter algorithm with inexact line search
- A trust region method for optimization problem with singular solutions
- Nonconvex optimization using negative curvature within a modified linesearch
- Nonmonotone spectral gradient method based on memoryless symmetric rank-one update for large-scale unconstrained optimization
- Nonmonotone curvilinear line search methods for unconstrained optimization
- A subspace implementation of quasi-Newton trust region methods for unconstrained optimization
- Some three-term conjugate gradient methods with the new direction structure
- Global and local convergence of a nonmonotone SQP method for constrained nonlinear optimization
- Self-adaptive inexact proximal point methods
- A new \(\varepsilon \)-generalized projection method of strongly sub-feasible directions for inequality constrained optimization
- A sufficient descent LS conjugate gradient method for unconstrained optimization problems
- A working set SQCQP algorithm with simple nonmonotone penalty parameters
- A new globalization technique for nonlinear conjugate gradient methods for nonconvex minimization
- A trajectory-based method for constrained nonlinear optimization problems
- Global convergence of a nonmonotone trust region algorithm with memory for unconstrained optimization
- A new Liu-Storey type nonlinear conjugate gradient method for unconstrained optimization problems
- A superlinearly convergent strongly sub-feasible SSLE-type algorithm with working set for nonlinearly constrained optimization
- Properties of the block BFGS update and its application to the limited-memory block BNS method for unconstrained minimization
- A robust implementation of a sequential quadratic programming algorithm with successive error restoration
- A limited memory descent Perry conjugate gradient method
- A modified CG-DESCENT method for unconstrained optimization
- On the method of shortest residuals for unconstrained optimization
- A nonmonotone scaled conjugate gradient algorithm for large-scale unconstrained optimization
- A feasible QP-free algorithm combining the interior-point method with active set for constrained optimization
- An active set modified Polak-Ribière-Polyak method for large-scale nonlinear bound constrained optimization
- Dai-Kou type conjugate gradient methods with a line search only using gradient
- A modified three-term PRP conjugate gradient algorithm for optimization models
- A modified spectral conjugate gradient method with global convergence
- Numerical experience with a class of self-scaling quasi-Newton algorithms
- A hybrid conjugate gradient method with descent property for unconstrained optimization
- A subspace limited memory quasi-Newton algorithm for large-scale nonlinear bound constrained optimization
- CUTEst: a constrained and unconstrained testing environment with safe threads for mathematical optimization
- A globally convergent penalty-free method for optimization with equality constraints and simple bounds
- Spectral method and its application to the conjugate gradient method
- Structured minimal-memory inexact quasi-Newton method and secant preconditioners for augmented Lagrangian optimization
- A family of second-order methods for convex \(\ell _1\)-regularized optimization
- Nonmonotone adaptive trust region method with line search based on new diagonal updating
- An efficient modification of the Hestenes-Stiefel nonlinear conjugate gradient method with restart property
- Efficient tridiagonal preconditioner for the matrix-free truncated Newton method
- Quasi-Newton methods based on ordinary differential equation approach for unconstrained nonlinear optimization
- A fast convergent sequential linear equation method for inequality constrained optimization without strict complementarity
- On Hager and Zhang's conjugate gradient method with guaranteed descent
- The convergence of conjugate gradient method with nonmonotone line search
- Benchmarking nonlinear optimization software in technical computing environments
- Global and local convergence of a class of penalty-free-type methods for nonlinear programming
- Another improved Wei-Yao-Liu nonlinear conjugate gradient method with sufficient descent property
- Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization
- An inexact Newton method for nonconvex equality constrained optimization
- An improved strongly sub-feasible SSLE method for optimization problems and numerical experiments
- New hybrid conjugate gradient method for unconstrained optimization
- A limited memory BFGS algorithm for non-convex minimization with applications in matrix largest eigenvalue problem
- Spectral scaling BFGS method
- A new sequential systems of linear equations algorithm of feasible descent for inequality constrained optimization
- Improved Hessian approximation with modified secant equations for symmetric rank-one method
- A spectral conjugate gradient method for solving large-scale unconstrained optimization
- Scaled memoryless symmetric rank one method for large-scale optimization
- Globally convergent modified Perry's conjugate gradient method
- A dwindling filter line search method for unconstrained optimization
- Algorithm 851
- Global convergence of some modified PRP nonlinear conjugate gradient methods
- New version of the three-term conjugate gradient method based on spectral scaling conjugacy condition that generates descent search direction
- A conic trust-region method and its convergence properties
- An improved Wei-Yao-Liu nonlinear conjugate gradient method for optimization computation
- A primal-dual augmented Lagrangian
- Two descent hybrid conjugate gradient methods for optimization
- A self-adaptive three-term conjugate gradient method for monotone nonlinear equations with convex constraints
- Scaled conjugate gradient algorithms for unconstrained optimization
- An acceleration of gradient descent algorithm with backtracking for unconstrained optimization
- Augmented Lagrangian applied to convex quadratic problems
- A conjugate gradient method for unconstrained optimization problems
This page was built for software: CUTE