CUTEst
From MaRDI portal
Software: 23829
swMATH: 11893 · MaRDI QID: Q23829 · FDO: Q23829
Author name not available
Cited In (showing first 100 items)
- On Using Cholesky-Based Factorizations and Regularization for Solving Rank-Deficient Sparse Linear Least-Squares Problems
- Iterative Solution of Symmetric Quasi-Definite Linear Systems
- On efficiency of nonmonotone Armijo-type line searches
- A Feasible Active Set Method for Strictly Convex Quadratic Problems with Simple Bounds
- A Nonmonotone Filter SQP Method: Local Convergence and Numerical Results
- A new regularized quasi-Newton method for unconstrained optimization
- An accelerated nonmonotone trust region method with adaptive trust region for unconstrained optimization
- Augmented Lagrangians with constrained subproblems and convergence to second-order stationary points
- Exploiting damped techniques for nonlinear conjugate gradient methods
- A regularization method for constrained nonlinear least squares
- An infeasible interior-point arc-search algorithm for nonlinear constrained optimization
- Block preconditioners for linear systems in interior point methods for convex constrained optimization
- A line-search algorithm inspired by the adaptive cubic regularization framework and complexity analysis
- A two-stage active-set algorithm for bound-constrained optimization
- A new restarting adaptive trust-region method for unconstrained optimization
- A limited memory quasi-Newton trust-region method for box constrained optimization
- A Tridiagonalization Method for Symmetric Saddle-Point Systems
- A progressive barrier derivative-free trust-region algorithm for constrained optimization
- A Newton-like method with mixed factorizations and cubic regularization for unconstrained minimization
- A new derivative-free conjugate gradient method for large-scale nonlinear systems of equations
- Robust optimization of noisy blackbox problems using the mesh adaptive direct search algorithm
- A Derivative-Free Method for Structured Optimization Problems
- Complexity and performance of an Augmented Lagrangian algorithm
- A Schur complement approach to preconditioning sparse linear least-squares problems with some dense rows
- On Regularization and Active-set Methods with Complexity for Constrained Optimization
- Exact linesearch limited-memory quasi-Newton methods for minimizing a quadratic function
- Best practices for comparing optimization algorithms
- Speeding up the convergence of the Polyak's heavy ball algorithm
- Scaled projected-directions methods with application to transmission tomography
- Linear equalities in blackbox optimization
- Two new Dai-Liao-type conjugate gradient methods for unconstrained optimization problems
- Sequential equality-constrained optimization for nonlinear programming
- Primal and dual active-set methods for convex quadratic programming
- trlib: a vector-free implementation of the GLTR method for iterative solution of the trust region problem
- A primal-dual augmented Lagrangian penalty-interior-point filter line search algorithm
- Preconditioned nonlinear conjugate gradient methods based on a modified secant equation
- BFO, A Trainable Derivative-free Brute Force Optimizer for Nonlinear Bound-constrained Optimization and Equilibrium Computations with Continuous and Discrete Variables
- An efficient Dai-Liao type conjugate gradient method by reformulating the CG parameter in the search direction equation
- An active-set algorithm for norm constrained quadratic problems
- Diagonal BFGS updates and applications to the limited memory BFGS method
- On the update of constraint preconditioners for regularized KKT systems
- Two globally convergent nonmonotone trust-region methods for unconstrained optimization
- A novel class of approximate inverse preconditioners for large positive definite linear systems in optimization
- Updating Constraint Preconditioners for KKT Systems in Quadratic Programming Via Low-Rank Corrections
- Assessing the reliability of general-purpose inexact restoration methods
- Dynamic scaling in the mesh adaptive direct search algorithm for blackbox optimization
- Non-monotone algorithm for minimization on arbitrary domains with applications to large-scale orthogonal Procrustes problem
- LMBOPT: a limited memory method for bound-constrained optimization
- Algebraic rules for computing the regularization parameter of the Levenberg-Marquardt method
- Sequential quadratic programming methods for parametric nonlinear optimization
- Efficient Preconditioners for Interior Point Methods via a New Schur Complement-Based Strategy
- An augmented Lagrangian method exploiting an active-set strategy and second-order information
- Primal-dual active-set methods for large-scale optimization
- An interior-point implementation developed and tuned for radiation therapy treatment planning
- A stabilized SQP method: superlinear convergence
- A Solver for Nonconvex Bound-Constrained Quadratic Optimization
- On the Performance of SQP Methods for Nonlinear Optimization
- Secant penalized BFGS: a noise robust quasi-Newton method via penalizing the secant condition
- Solving Mixed Sparse-Dense Linear Least-Squares Problems by Preconditioned Iterative Methods
- A subspace SQP method for equality constrained optimization
- The State-of-the-Art of Preconditioners for Sparse Linear Least-Squares Problems
- An active set trust-region method for bound-constrained optimization
- Methods for convex and general quadratic programming
- On the complexity of solving feasibility problems with regularized models
- A new class of conjugate gradient methods for unconstrained smooth optimization and absolute value equations
- QPALM: a proximal augmented Lagrangian method for nonconvex quadratic programs
- An improved adaptive trust-region algorithm
- An extended nonmonotone line search technique for large-scale unconstrained optimization
- The Use of Quadratic Regularization with a Cubic Descent Condition for Unconstrained Optimization
- Null-space preconditioners for saddle point systems
- A dual gradient-projection method for large-scale strictly convex quadratic problems
- A sequential quadratic programming algorithm for equality-constrained optimization without derivatives
- An inexact proximal regularization method for unconstrained optimization
- Global optimization test problems based on random field composition
- Novel preconditioners based on quasi-Newton updates for nonlinear conjugate gradient methods
- Improving the Flexibility and Robustness of Model-based Derivative-free Optimization Solvers
- A new conjugate gradient method with an efficient memory structure
- A comparative study of null‐space factorizations for sparse symmetric saddle point systems
- An efficient hybrid conjugate gradient method for unconstrained optimization
- A null-space approach for large-scale symmetric saddle point systems with a small and non zero \((2, 2)\) block
- The Conjugate Residual Method in Linesearch and Trust-Region Methods
- An augmented Lagrangian filter method
- Error estimates for iterative algorithms for minimizing regularized quadratic subproblems
- An improved hybrid-ORBIT algorithm based on point sorting and MLE technique
- An efficient nonmonotone adaptive cubic regularization method with line search for unconstrained optimization problem
- A new subspace minimization conjugate gradient method based on tensor model for unconstrained optimization
- Exploiting negative curvature in deterministic and stochastic optimization
- A one-parameter class of three-term conjugate gradient methods with an adaptive parameter choice
- A derivative-free exact penalty algorithm: basic ideas, convergence theory and computational studies
- A Class of Approximate Inverse Preconditioners Based on Krylov-Subspace Methods for Large-Scale Nonconvex Optimization
- Trust-Region Newton-CG with Strong Second-Order Complexity Guarantees for Nonconvex Optimization
- Equipping the Barzilai--Borwein Method with the Two Dimensional Quadratic Termination Property
- A Noise-Tolerant Quasi-Newton Algorithm for Unconstrained Optimization
- Adaptive Finite-Difference Interval Estimation for Noisy Derivative-Free Optimization
- High-order evaluation complexity for convexly-constrained optimization with non-Lipschitzian group sparsity terms
- Issues on the use of a modified Bunch and Kaufman decomposition for large scale Newton's equation
- Full-low evaluation methods for derivative-free optimization
- A note on solving nonlinear optimization problems in variable precision
- An efficient hybrid conjugate gradient method with sufficient descent property for unconstrained optimization
- Iterative grossone-based computation of negative curvature directions in large-scale optimization
This page was built for software: CUTEst