swMATH: 14681 · MaRDI QID: Q26578
Author name not available
Official website: http://dl.acm.org/citation.cfm?doid=200979.201043
Cited In (first 100 items shown)
- A subspace limited memory quasi-Newton algorithm for large-scale nonlinear bound constrained optimization
- CUTEst: a constrained and unconstrained testing environment with safe threads for mathematical optimization
- A globally convergent penalty-free method for optimization with equality constraints and simple bounds
- Spectral method and its application to the conjugate gradient method
- Structured minimal-memory inexact quasi-Newton method and secant preconditioners for augmented Lagrangian optimization
- A family of second-order methods for convex \(\ell _1\)-regularized optimization
- An efficient modification of the Hestenes-Stiefel nonlinear conjugate gradient method with restart property
- Efficient tridiagonal preconditioner for the matrix-free truncated Newton method
- Quasi-Newton methods based on ordinary differential equation approach for unconstrained nonlinear optimization
- A fast convergent sequential linear equation method for inequality constrained optimization without strict complementarity
- On Hager and Zhang's conjugate gradient method with guaranteed descent
- The convergence of conjugate gradient method with nonmonotone line search
- Global and local convergence of a class of penalty-free-type methods for nonlinear programming
- Another improved Wei-Yao-Liu nonlinear conjugate gradient method with sufficient descent property
- Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization
- An inexact Newton method for nonconvex equality constrained optimization
- An improved strongly sub-feasible SSLE method for optimization problems and numerical experiments
- New hybrid conjugate gradient method for unconstrained optimization
- A limited memory BFGS algorithm for non-convex minimization with applications in matrix largest eigenvalue problem
- Spectral scaling BFGS method
- A new sequential systems of linear equations algorithm of feasible descent for inequality constrained optimization
- Improved Hessian approximation with modified secant equations for symmetric rank-one method
- Scaled memoryless symmetric rank one method for large-scale optimization
- Globally convergent modified Perry's conjugate gradient method
- A dwindling filter line search method for unconstrained optimization
- Algorithm 851
- Global convergence of some modified PRP nonlinear conjugate gradient methods
- New version of the three-term conjugate gradient method based on spectral scaling conjugacy condition that generates descent search direction
- A conic trust-region method and its convergence properties
- An improved Wei-Yao-Liu nonlinear conjugate gradient method for optimization computation
- Scaled conjugate gradient algorithms for unconstrained optimization
- An acceleration of gradient descent algorithm with backtracking for unconstrained optimization
- Augmented Lagrangian applied to convex quadratic problems
- A conjugate gradient method for unconstrained optimization problems
- Accelerated hybrid conjugate gradient algorithm with modified secant condition for unconstrained optimization
- Hybrid conjugate gradient algorithm for unconstrained optimization
- A numerical study of limited memory BFGS methods
- Augmented Lagrangian methods under the constant positive linear dependence constraint qualification
- Another hybrid conjugate gradient algorithm for unconstrained optimization
- A repository of convex quadratic programming problems
- Algorithm 739
- DAFNE
- CONMIN
- CG_DESCENT
- ipfilter
- MINPACK-2
- NNLS
- levmar
- MINOPT
- TRON
- UFO
- FFSQP(f77)
- MPSreader
- Algorithm 500
- minpack
- tn
- ve08
- BQPD
- GQTPAR
- QPOPT
- COPS
- MSS
- SCALCG
- ZQPCVX
- OSL
- edge_push_sp
- hess_pat
- MA27
- QPLIB2014
- QL
- NLPIP
- Algorithm 709
- LINUOA
- ACGSSV
- LSA
- NLPNET
- Algorithm 566
- PPCG_lrep
- MMLA1Q
- A Dai-Yuan conjugate gradient algorithm with sufficient descent and conjugacy conditions for unconstrained optimization
- Global convergence of a modified Fletcher-Reeves conjugate gradient method with Armijo-type line search
- An interior algorithm for nonlinear optimization that combines line search and trust region steps
- A new method of moving asymptotes for large-scale unconstrained optimization
- Matching-based preprocessing algorithms to the solution of saddle-point problems in large-scale nonconvex interior-point optimization
- Recent progress in unconstrained nonlinear optimization without derivatives
- An implementation of Shor's \(r\)-algorithm
- Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization
- Numerical experiments with the Lancelot package (Release \(A\)) for large-scale nonlinear optimization
- Algorithm 943: MSS: MATLAB software for L-BFGS trust-region subproblems for large-scale optimization
- Acceleration of conjugate gradient algorithms for unconstrained optimization
- Nonlinear conjugate gradient methods with sufficient descent condition for large-scale unconstrained optimization
- An adaptive trust region method based on simple conic models
- Numerical experience with a class of self-scaling quasi-Newton algorithms
- A hybrid conjugate gradient method with descent property for unconstrained optimization
- On per-iteration complexity of high order Chebyshev methods for sparse functions with banded Hessians
- Mixed integer nonlinear programming tools: a practical overview
- Modified nonmonotone Armijo line search for descent method
- Simple sequential quadratically constrained quadratic programming feasible algorithm with active identification sets for constrained minimax problems
- Nonmonotone adaptive trust region method with line search based on new diagonal updating
- Global convergence of quasi-Newton methods based on adjoint Broyden updates
This page was built for software: CUTE