swMATH: 14681 · MaRDI QID: Q26578
Author name not available
Official website: http://dl.acm.org/citation.cfm?doid=200979.201043
Cited In (first 100 items shown)
- An adaptive trust region method based on simple conic models
- Numerical experience with a class of self-scaling quasi-Newton algorithms
- A hybrid conjugate gradient method with descent property for unconstrained optimization
- On per-iteration complexity of high order Chebyshev methods for sparse functions with banded Hessians
- Mixed integer nonlinear programming tools: a practical overview
- Modified nonmonotone Armijo line search for descent method
- Simple sequential quadratically constrained quadratic programming feasible algorithm with active identification sets for constrained minimax problems
- Nonmonotone adaptive trust region method with line search based on new diagonal updating
- Global convergence of quasi-Newton methods based on adjoint Broyden updates
- A globally and superlinearly convergent primal-dual interior point trust region method for large scale constrained optimization
- Benchmarking nonlinear optimization software in technical computing environments
- A new general form of conjugate gradient methods with guaranteed descent and strong global convergence properties
- Simulated annealing with asymptotic convergence for nonlinear constrained optimization
- A spectral conjugate gradient method for solving large-scale unconstrained optimization
- A penalty-interior-point algorithm for nonlinear constrained optimization
- Two modified Dai-Yuan nonlinear conjugate gradient methods
- Descentwise inexact proximal algorithms for smooth optimization
- A primal-dual augmented Lagrangian
- On a two-phase approximate greatest descent method for nonlinear optimization with equality constraints
- Two descent hybrid conjugate gradient methods for optimization
- A self-adaptive three-term conjugate gradient method for monotone nonlinear equations with convex constraints
- A regularized Newton method for degenerate unconstrained optimization problems
- A modified Perry conjugate gradient method and its global convergence
- Improving solver success in reaching feasibility for sets of nonlinear constraints
- The global convergence of a descent PRP conjugate gradient method
- Convergence of nonmonotone line search method
- Mathematical programming models and algorithms for engineering design optimization
- Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search
- Accelerated conjugate gradient algorithm with finite difference Hessian/vector product approximation for unconstrained optimization
- A nonmonotone filter method for nonlinear optimization
- Global convergence of a spectral conjugate gradient method for unconstrained optimization
- The TOMLAB optimization environment in MATLAB
- Primal and dual active-set methods for convex quadratic programming
- An active set truncated Newton method for large-scale bound constrained optimization
- A new family of conjugate gradient methods
- A scaled BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- Interior-point methods for nonconvex nonlinear programming: cubic regularization
- A new family of penalties for augmented Lagrangian methods
- A trust region method for optimization problem with singular solutions
- Nonconvex optimization using negative curvature within a modified linesearch
- Nonmonotone curvilinear line search methods for unconstrained optimization
- A subspace implementation of quasi-Newton trust region methods for unconstrained optimization
- Global and local convergence of a nonmonotone SQP method for constrained nonlinear optimization
- A new spectral PRP conjugate gradient method with sufficient descent property
- Self-adaptive inexact proximal point methods
- On the performance of a new symmetric rank-one method with restart for solving unconstrained optimization problems
- Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- Inertia-controlling factorizations for optimization algorithms
- A new globalization technique for nonlinear conjugate gradient methods for nonconvex minimization
- An improved spectral conjugate gradient algorithm for nonconvex unconstrained optimization problems
- A restoration-free filter SQP algorithm for equality constrained optimization
- A symmetric rank-one method based on extra updating techniques for unconstrained optimization
- A family of three-term conjugate gradient methods with sufficient descent property for unconstrained optimization
- Scaling on diagonal quasi-Newton update for large-scale unconstrained optimization
- A trajectory-based method for constrained nonlinear optimization problems
- Nonmonotone strategy for minimization of quadratics with simple constraints
- A practical relative error criterion for augmented Lagrangians
- Global convergence of a modified Hestenes-Stiefel nonlinear conjugate gradient method with Armijo line search
- A superlinearly convergent strongly sub-feasible SSLE-type algorithm with working set for nonlinearly constrained optimization
- A robust implementation of a sequential quadratic programming algorithm with successive error restoration
- A limited memory descent Perry conjugate gradient method
- Sufficient descent nonlinear conjugate gradient methods with conjugacy condition
- A modified CG-DESCENT method for unconstrained optimization
- Monotone projected gradient methods for large-scale box-constrained quadratic programming
- On the method of shortest residuals for unconstrained optimization
- New accelerated conjugate gradient algorithms as a modification of Dai-Yuan's computational scheme for unconstrained optimization
- Mixed integer nonlinear programming tools: an updated practical overview
- An active set modified Polak-Ribière-Polyak method for large-scale nonlinear bound constrained optimization
- Dai-Kou type conjugate gradient methods with a line search only using gradient
- A modified three-term PRP conjugate gradient algorithm for optimization models
- A subspace limited memory quasi-Newton algorithm for large-scale nonlinear bound constrained optimization
- CUTEst: a constrained and unconstrained testing environment with safe threads for mathematical optimization
- A globally convergent penalty-free method for optimization with equality constraints and simple bounds
- Spectral method and its application to the conjugate gradient method
- Structured minimal-memory inexact quasi-Newton method and secant preconditioners for augmented Lagrangian optimization
- A family of second-order methods for convex \(\ell _1\)-regularized optimization
- An efficient modification of the Hestenes-Stiefel nonlinear conjugate gradient method with restart property
- Efficient tridiagonal preconditioner for the matrix-free truncated Newton method
- Quasi-Newton methods based on ordinary differential equation approach for unconstrained nonlinear optimization
- A fast convergent sequential linear equation method for inequality constrained optimization without strict complementarity
- On Hager and Zhang's conjugate gradient method with guaranteed descent
- The convergence of conjugate gradient method with nonmonotone line search
- Global and local convergence of a class of penalty-free-type methods for nonlinear programming
- Another improved Wei-Yao-Liu nonlinear conjugate gradient method with sufficient descent property
- Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization
- An inexact Newton method for nonconvex equality constrained optimization
- An improved strongly sub-feasible SSLE method for optimization problems and numerical experiments
- New hybrid conjugate gradient method for unconstrained optimization
- A limited memory BFGS algorithm for non-convex minimization with applications in matrix largest eigenvalue problem
- Spectral scaling BFGS method
- A new sequential systems of linear equations algorithm of feasible descent for inequality constrained optimization
- Improved Hessian approximation with modified secant equations for symmetric rank-one method
- Scaled memoryless symmetric rank one method for large-scale optimization
- Globally convergent modified Perry's conjugate gradient method
- A dwindling filter line search method for unconstrained optimization
- Algorithm 851
- Global convergence of some modified PRP nonlinear conjugate gradient methods
- New version of the three-term conjugate gradient method based on spectral scaling conjugacy condition that generates descent search direction
- A conic trust-region method and its convergence properties
- An improved Wei-Yao-Liu nonlinear conjugate gradient method for optimization computation
This page was built for software: CUTE