CUTEr
From MaRDI portal
Software: 16200
swMATH: 4010 · MaRDI QID: Q16200 · FDO: Q16200
Author name not available
Cited In (only showing first 100 items)
- SNOPT: An SQP Algorithm for Large-Scale Constrained Optimization
- CUTEst: a constrained and unconstrained testing environment with safe threads for mathematical optimization
- An Interior-Point Algorithm for Large-Scale Nonlinear Optimization with Inexact Step Computations
- A Subspace Modified PRP Method for Large-scale Nonlinear Box-Constrained Optimization
- A globally convergent penalty-free method for optimization with equality constraints and simple bounds
- Spectral method and its application to the conjugate gradient method
- Structured minimal-memory inexact quasi-Newton method and secant preconditioners for augmented Lagrangian optimization
- The cyclic Barzilai–Borwein method for unconstrained optimization
- On the implementation of an interior-point filter line-search algorithm for large-scale nonlinear programming
- A family of second-order methods for convex \(\ell _1\)-regularized optimization
- The optimization test environment
- An efficient modification of the Hestenes-Stiefel nonlinear conjugate gradient method with restart property
- Efficient tridiagonal preconditioner for the matrix-free truncated Newton method
- Quasi-Newton methods based on ordinary differential equation approach for unconstrained nonlinear optimization
- A fast convergent sequential linear equation method for inequality constrained optimization without strict complementarity
- On Hager and Zhang's conjugate gradient method with guaranteed descent
- The convergence of conjugate gradient method with nonmonotone line search
- CasADi: A Symbolic Package for Automatic Differentiation and Optimal Control
- Preconditioning Newton-Krylov methods in nonconvex large scale optimization
- Global and local convergence of a class of penalty-free-type methods for nonlinear programming
- Another improved Wei-Yao-Liu nonlinear conjugate gradient method with sufficient descent property
- Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization
- Direct Search Based on Probabilistic Descent
- A numerical evaluation of several stochastic algorithms on selected continuous global optimization test problems
- A modified PRP conjugate gradient method
- An inexact Newton method for nonconvex equality constrained optimization
- Nonlinear programming without a penalty function or a filter
- An improved strongly sub-feasible SSLE method for optimization problems and numerical experiments
- Spectral scaling BFGS method
- A new sequential systems of linear equations algorithm of feasible descent for inequality constrained optimization
- Improved Hessian approximation with modified secant equations for symmetric rank-one method
- Recognizing underlying sparsity in optimization
- A limited memory steepest descent method
- PSwarm: a hybrid solver for linearly constrained global derivative-free optimization
- Scaled memoryless symmetric rank one method for large-scale optimization
- A nonmonotone truncated Newton-Krylov method exploiting negative curvature directions, for large scale unconstrained optimization
- Worst case complexity of direct search under convexity
- Computing sparse Hessians with automatic differentiation
- Trust-region and other regularisations of linear least-squares problems
- A conic trust-region method and its convergence properties
- An improved Wei-Yao-Liu nonlinear conjugate gradient method for optimization computation
- Optimization theory and methods. Nonlinear programming
- Scaled conjugate gradient algorithms for unconstrained optimization
- An acceleration of gradient descent algorithm with backtracking for unconstrained optimization
- Algorithm 778: L-BFGS-B
- Algorithm 765: STENMIN—a software package for large, sparse unconstrained optimization using tensor methods
- A numerical study of limited memory BFGS methods
- On the behavior of the gradient norm in the steepest descent method
- SDPLIB 1.2, a library of semidefinite programming test problems
- A spectral Dai-Yuan-type conjugate gradient method for unconstrained optimization
- A Three-Term Conjugate Gradient Method with Sufficient Descent Property for Unconstrained Optimization
- Improved Fletcher-Reeves and Dai-Yuan conjugate gradient methods with the strong Wolfe line search
- A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence
- Second-order negative-curvature methods for box-constrained and general constrained optimization
- Augmented Lagrangian methods under the constant positive linear dependence constraint qualification
- Another hybrid conjugate gradient algorithm for unconstrained optimization
- A repository of convex quadratic programming problems
- Solving the Trust-Region Subproblem using the Lanczos Method
- CUTE
- Algorithm 943
- MA57: a code for the solution of sparse symmetric definite and indefinite systems
- TRESNEI, a MATLAB trust-region solver for systems of nonlinear equalities and inequalities
- Algorithm 813
- GALAHAD, a library of thread-safe Fortran 90 packages for large-scale nonlinear optimization
- CONDOR, a new parallel, constrained extension of Powell's UOBYQA algorithm: Experimental results and comparison with the DFO algorithm
- Optimization of algorithms with OPAL
- Globally Solving Nonconvex Quadratic Programs via Linear Integer Programming Techniques
- A Dai-Yuan conjugate gradient algorithm with sufficient descent and conjugacy conditions for unconstrained optimization
- Algorithm 909
- On the solution of equality constrained quadratic programming problems arising in optimization
- Numerical solution of saddle point problems
- A trust region method based on a new affine scaling technique for simple bounded optimization
- Improving ultimate convergence of an augmented Lagrangian method
- On Augmented Lagrangian Methods with General Lower-Level Constraints
- Spectral conjugate gradient methods with sufficient descent property for large-scale unconstrained optimization
- qpOASES: a parametric active-set algorithm for quadratic programming
- Dynamic Control of Infeasibility in Equality Constrained Optimization
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A Subspace, Interior, and Conjugate Gradient Method for Large-Scale Bound-Constrained Minimization Problems
- Recent progress in unconstrained nonlinear optimization without derivatives
- Dynamic scaling based preconditioning for truncated Newton methods in large scale unconstrained optimization
- A New Active Set Algorithm for Box Constrained Optimization
- A Nonlinear Conjugate Gradient Algorithm with an Optimal Property and an Improved Wolfe Line Search
- Symmetric Perry conjugate gradient method
- Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization
- Some descent three-term conjugate gradient methods and their global convergence
- MINQ8: general definite and bound constrained indefinite quadratic programming
- Preprocessing for quadratic programming
- A coordinate gradient descent method for nonsmooth separable minimization
- Acceleration of conjugate gradient algorithms for unconstrained optimization
- Nonlinear conjugate gradient methods with sufficient descent condition for large-scale unconstrained optimization
- An adaptive trust region method based on simple conic models
- A new framework for the computation of Hessians
- A primal-dual interior-point algorithm for nonlinear least squares constrained problems
- A regularized Newton method without line search for unconstrained optimization
- Mixed integer nonlinear programming tools: a practical overview
- Efficient use of parallelism in algorithmic parameter optimization applications
- A modified three-term conjugate gradient method with sufficient descent property
This page was built for software: CUTEr