swMATH ID: 11893 · MaRDI QID: Q23829
Author name not available
Official website: https://github.com/ralna/CUTEst/wiki
Source code repository: https://github.com/ralna/CUTEst
Cited In (showing first 100 items)
- A new conjugate gradient method with an efficient memory structure
- An efficient hybrid conjugate gradient method for unconstrained optimization
- A null-space approach for large-scale symmetric saddle point systems with a small and non zero \((2, 2)\) block
- An augmented Lagrangian filter method
- Error estimates for iterative algorithms for minimizing regularized quadratic subproblems
- An improved hybrid-ORBIT algorithm based on point sorting and MLE technique
- An efficient nonmonotone adaptive cubic regularization method with line search for unconstrained optimization problem
- EQ
- A new subspace minimization conjugate gradient method based on tensor model for unconstrained optimization
- A line-search algorithm inspired by the adaptive cubic regularization framework and complexity analysis
- Exploiting negative curvature in deterministic and stochastic optimization
- MINQ8
- A derivative-free exact penalty algorithm: basic ideas, convergence theory and computational studies
- High-order evaluation complexity for convexly-constrained optimization with non-Lipschitzian group sparsity terms
- Issues on the use of a modified Bunch and Kaufman decomposition for large scale Newton's equation
- A regularized factorization-free method for equality-constrained optimization
- A note on solving nonlinear optimization problems in variable precision
- Iterative grossone-based computation of negative curvature directions in large-scale optimization
- Implementing a smooth exact penalty function for equality-constrained nonlinear optimization
- Approximate solution of system of equations arising in interior-point methods for bound-constrained optimization
- Stochastic mesh adaptive direct search for blackbox optimization using probabilistic estimates
- Secant update generalized version of PSB: a new approach
- A decoupled first/second-order steps technique for nonconvex nonlinear unconstrained optimization with improved complexity bounds
- Improving the flexibility and robustness of model-based derivative-free optimization solvers
- An adaptive truncation criterion, for linesearch-based truncated Newton methods in large scale nonconvex optimization
- Limited-memory BFGS with displacement aggregation
- A comparative study of null-space factorizations for sparse symmetric saddle point systems
- A new augmented Lagrangian method for equality constrained optimization with simple unconstrained subproblem
- An accelerated first-order method with complexity analysis for solving cubic regularization subproblems
- Direct search based on probabilistic feasible descent for bound and linearly constrained problems
- Backward step control for global Newton-type methods
- A derivative-free Gauss-Newton method
- PDFO
- On global minimizers of quadratic functions with cubic regularization
- Preconditioning of linear least squares by robust incomplete factorization for implicitly held normal equations
- A new subspace minimization conjugate gradient method with nonmonotone line search for unconstrained optimization
- Learning to steer nonlinear interior-point methods
- Efficient Preconditioners for Interior Point Methods via a New Schur Complement-Based Strategy
- Equipping the Barzilai-Borwein method with the two dimensional quadratic termination property
- An inexact first-order method for constrained nonlinear optimization
- An interior-point implementation developed and tuned for radiation therapy treatment planning
- Adaptive trust-region algorithms for unconstrained optimization
- Complexity of Partially Separable Convexly Constrained Optimization with Non-Lipschitzian Singularities
- Global convergence of a modified spectral three-term CG algorithm for nonconvex unconstrained optimization problems
- Efficient unconstrained black box optimization
- paper-regularized-qn-benchmark
- Two-step conjugate gradient method for unconstrained optimization
- On the solution of linearly constrained optimization problems by means of barrier algorithms
- Null-space preconditioners for saddle point systems
- Gradient methods exploiting spectral properties
- On Using Cholesky-Based Factorizations and Regularization for Solving Rank-Deficient Sparse Linear Least-Squares Problems
- On efficiency of nonmonotone Armijo-type line searches
- A new regularized quasi-Newton method for unconstrained optimization
- An accelerated nonmonotone trust region method with adaptive trust region for unconstrained optimization
- Augmented Lagrangians with constrained subproblems and convergence to second-order stationary points
- Exploiting damped techniques for nonlinear conjugate gradient methods
- A regularization method for constrained nonlinear least squares
- An infeasible interior-point arc-search algorithm for nonlinear constrained optimization
- Block preconditioners for linear systems in interior point methods for convex constrained optimization
- On the performance of SQP methods for nonlinear optimization
- A two-stage active-set algorithm for bound-constrained optimization
- A new restarting adaptive trust-region method for unconstrained optimization
- A limited memory quasi-Newton trust-region method for box constrained optimization
- A progressive barrier derivative-free trust-region algorithm for constrained optimization
- A Newton-like method with mixed factorizations and cubic regularization for unconstrained minimization
- BOX-QUACAN
- Robust optimization of noisy blackbox problems using the mesh adaptive direct search algorithm
- A Derivative-Free Method for Structured Optimization Problems
- A Schur complement approach to preconditioning sparse linear least-squares problems with some dense rows
- The use of quadratic regularization with a cubic descent condition for unconstrained optimization
- Solving mixed sparse-dense linear least-squares problems by preconditioned iterative methods
- Exact linesearch limited-memory quasi-Newton methods for minimizing a quadratic function
- Updating constraint preconditioners for KKT systems in quadratic programming via low-rank corrections
- Best practices for comparing optimization algorithms
- Speeding up the convergence of the Polyak's heavy ball algorithm
- Scaled projected-directions methods with application to transmission tomography
- A solver for nonconvex bound-constrained quadratic optimization
- Linear equalities in blackbox optimization
- Two new Dai-Liao-type conjugate gradient methods for unconstrained optimization problems
- Sequential equality-constrained optimization for nonlinear programming
- Primal and dual active-set methods for convex quadratic programming
- A primal-dual augmented Lagrangian penalty-interior-point filter line search algorithm
- Preconditioned nonlinear conjugate gradient methods based on a modified secant equation
- A tridiagonalization method for symmetric saddle-point systems
- Complexity and performance of an augmented Lagrangian algorithm
- An efficient Dai-Liao type conjugate gradient method by reformulating the CG parameter in the search direction equation
- An active-set algorithm for norm constrained quadratic problems
- Diagonal BFGS updates and applications to the limited memory BFGS method
- A new derivative-free conjugate gradient method for large-scale nonlinear systems of equations
- On the update of constraint preconditioners for regularized KKT systems
- Two globally convergent nonmonotone trust-region methods for unconstrained optimization
- A novel class of approximate inverse preconditioners for large positive definite linear systems in optimization
- Assessing the reliability of general-purpose inexact restoration methods
- DFBOX_IMPR
- JuliaStats
- Dynamic scaling in the mesh adaptive direct search algorithm for blackbox optimization
- GEMS
- Olympus
- NC-OPT
- dfppm
This page was built for software: CUTEst