CUTE
From MaRDI portal
Cited in
(only showing first 100 items)
- An affine scaling interior trust-region method combining with line search filter technique for optimization subject to bounds on variables
- An active set modified Polak-Ribière-Polyak method for large-scale nonlinear bound constrained optimization
- A nonmonotone scaled conjugate gradient algorithm for large-scale unconstrained optimization
- A modified spectral conjugate gradient method with global convergence
- An adaptive trust region method based on simple conic models
- Numerical experience with a class of self-scaling quasi-Newton algorithms
- A note on the implementation of an interior-point algorithm for nonlinear optimization with inexact step computations
- A hybrid conjugate gradient method with descent property for unconstrained optimization
- Adaptive, limited-memory BFGS algorithms for unconstrained optimization
- On per-iteration complexity of high order Chebyshev methods for sparse functions with banded Hessians
- Mixed integer nonlinear programming tools: a practical overview
- A nonmonotone approximate sequence algorithm for unconstrained nonlinear optimization
- A globally convergent penalty-free method for optimization with equality constraints and simple bounds
- A subspace limited memory quasi-Newton algorithm for large-scale nonlinear bound constrained optimization
- Modified nonmonotone Armijo line search for descent method
- Modification of nonlinear conjugate gradient method with weak Wolfe-Powell line search
- CUTEst: a constrained and unconstrained testing environment with safe threads for mathematical optimization
- Spectral method and its application to the conjugate gradient method
- Two modified conjugate gradient methods for unconstrained optimization with applications in image restoration problems
- An efficient conjugate gradient-based algorithm for unconstrained optimization and its projection extension to large-scale constrained nonlinear equations with applications in signal recovery and image denoising problems
- A family of second-order methods for convex \(\ell_1\)-regularized optimization
- Simple sequential quadratically constrained quadratic programming feasible algorithm with active identification sets for constrained minimax problems
- Structured minimal-memory inexact quasi-Newton method and secant preconditioners for augmented Lagrangian optimization
- Nonmonotone adaptive trust region method with line search based on new diagonal updating
- Global convergence of quasi-Newton methods based on adjoint Broyden updates
- Multi-step spectral gradient methods with modified weak secant relation for large scale unconstrained optimization
- Diagonal quasi-Newton methods via least change updating principle with weighted Frobenius norm
- An efficient modification of the Hestenes-Stiefel nonlinear conjugate gradient method with restart property
- Efficient tridiagonal preconditioner for the matrix-free truncated Newton method
- Quasi-Newton methods based on ordinary differential equation approach for unconstrained nonlinear optimization
- A fast convergent sequential linear equation method for inequality constrained optimization without strict complementarity
- On Hager and Zhang's conjugate gradient method with guaranteed descent
- A globally and superlinearly convergent primal-dual interior point trust region method for large scale constrained optimization
- The convergence of conjugate gradient method with nonmonotone line search
- A new diagonal quasi-Newton updating method with scaled forward finite differences directional derivative for unconstrained optimization
- A three term Polak-Ribière-Polyak conjugate gradient method close to the memoryless BFGS quasi-Newton method
- A modified nonlinear conjugate gradient method with the Armijo line search and its application
- Global and local convergence of a class of penalty-free-type methods for nonlinear programming
- Benchmarking nonlinear optimization software in technical computing environments
- A simulated annealing-based Barzilai-Borwein gradient method for unconstrained optimization problems
- A new general form of conjugate gradient methods with guaranteed descent and strong global convergence properties
- Another improved Wei-Yao-Liu nonlinear conjugate gradient method with sufficient descent property
- Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization
- A reduced proximal-point homotopy method for large-scale non-convex BQP
- A novel value for the parameter in the Dai-Liao-type conjugate gradient method
- A one-parameter class of three-term conjugate gradient methods with an adaptive parameter choice
- An improved strongly sub-feasible SSLE method for optimization problems and numerical experiments
- New hybrid conjugate gradient method for unconstrained optimization
- An inexact Newton method for nonconvex equality constrained optimization
- A limited memory BFGS algorithm for non-convex minimization with applications in matrix largest eigenvalue problem
- Spectral scaling BFGS method
- The global proof of the Polak-Ribière-Polyak algorithm under the YWL inexact line search technique
- A new sequential systems of linear equations algorithm of feasible descent for inequality constrained optimization
- Improved Hessian approximation with modified secant equations for symmetric rank-one method
- A modified Hestenes-Stiefel conjugate gradient method with an optimal property
- Global convergence of a modified two-parameter scaled BFGS method with Yuan-Wei-Lu line search for unconstrained optimization
- A scaled three-term conjugate gradient method for large-scale unconstrained optimization problem
- Simulated annealing with asymptotic convergence for nonlinear constrained optimization
- Some nonlinear conjugate gradient methods based on spectral scaling secant equations
- Preconditioned conjugate gradient algorithms for nonconvex problems with box constraints
- Accelerated memory-less SR1 method with generalized secant equation for unconstrained optimization
- Globally convergent modified Perry's conjugate gradient method
- Scaled memoryless symmetric rank one method for large-scale optimization
- A spectral conjugate gradient method for solving large-scale unconstrained optimization
- Two modified Dai-Yuan nonlinear conjugate gradient methods
- A penalty-interior-point algorithm for nonlinear constrained optimization
- A globally convergent primal-dual interior-point relaxation method for nonlinear programs
- PAL-Hom method for QP and an application to LP
- A descent hybrid conjugate gradient method based on the memoryless BFGS update
- A method combining norm-relaxed QCQP subproblems with active set identification for inequality constrained optimization
- A sequential quadratic programming algorithm without a penalty function, a filter or a constraint qualification for inequality constrained optimization
- A dwindling filter line search method for unconstrained optimization
- Algorithm 851
- Global convergence of some modified PRP nonlinear conjugate gradient methods
- Two efficient modifications of AZPRP conjugate gradient method with sufficient descent property
- A note on memory-less SR1 and memory-less BFGS methods for large-scale unconstrained optimization
- New version of the three-term conjugate gradient method based on spectral scaling conjugacy condition that generates descent search direction
- A Gauss-Newton approach for solving constrained optimization problems using differentiable exact penalties
- A primal-dual augmented Lagrangian
- A conic trust-region method and its convergence properties
- Descentwise inexact proximal algorithms for smooth optimization
- An improved Wei-Yao-Liu nonlinear conjugate gradient method for optimization computation
- A shifted primal-dual penalty-barrier method for nonlinear optimization
- On a two-phase approximate greatest descent method for nonlinear optimization with equality constraints
- Some three-term conjugate gradient methods with the inexact line search condition
- A self-adaptive three-term conjugate gradient method for monotone nonlinear equations with convex constraints
- Two descent hybrid conjugate gradient methods for optimization
- A double parameter self-scaling memoryless BFGS method for unconstrained optimization
- A framework for globally convergent algorithms using gradient bounding functions
- A family of three-term nonlinear conjugate gradient methods close to the memoryless BFGS method
- The global convergence of the BFGS method with a modified WWP line search for nonconvex functions
- Scaled conjugate gradient algorithms for unconstrained optimization
- A regularized Newton method for degenerate unconstrained optimization problems
- A conjugate directions approach to improve the limited-memory BFGS method
- An acceleration of gradient descent algorithm with backtracking for unconstrained optimization
- Conjugate gradient methods using value of objective function for unconstrained optimization
- A modified Perry conjugate gradient method and its global convergence
- Augmented Lagrangian applied to convex quadratic problems
- A conjugate gradient method for unconstrained optimization problems
- Accelerated hybrid conjugate gradient algorithm with modified secant condition for unconstrained optimization
This page was built for software: CUTE