CUTEr
From MaRDI portal
Software: 16200
swMATH: 4010 · MaRDI QID: Q16200 · FDO: Q16200
Author name not available
Cited In (showing the first 100 items)
- A scaled nonlinear conjugate gradient algorithm for unconstrained optimization
- A subspace limited memory quasi-Newton algorithm for large-scale nonlinear bound constrained optimization
- Infeasibility Detection and SQP Methods for Nonlinear Optimization
- A Subspace Minimization Method for the Trust-Region Step
- On Modified Factorizations for Large-Scale Linearly Constrained Optimization
- On the Implementation of an Algorithm for Large-Scale Equality Constrained Optimization
- An active-set trust-region method for derivative-free nonlinear bound-constrained optimization
- Benchmarking Derivative-Free Optimization Algorithms
- A Multidimensional Filter Algorithm for Nonlinear Equations and Nonlinear Least-Squares
- Some modified conjugate gradient methods for unconstrained optimization
- New hybrid conjugate gradient method for unconstrained optimization
- Trust region algorithm with two subproblems for bound constrained problems
- OrthoMADS: A Deterministic MADS Instance with Orthogonal Directions
- Constraint-Style Preconditioners for Regularized Saddle Point Problems
- A Superlinearly Convergent Sequential Quadratically Constrained Quadratic Programming Algorithm for Degenerate Nonlinear Programming
- Globally convergent modified Perry's conjugate gradient method
- Implicit-Factorization Preconditioning and Iterative Solvers for Regularized Saddle-Point Systems
- A dwindling filter line search method for unconstrained optimization
- An optimal parameter for Dai-Liao family of conjugate gradient methods
- An Active Set Newton Algorithm for Large-Scale Nonlinear Programs with Box Constraints
- Global convergence of some modified PRP nonlinear conjugate gradient methods
- New version of the three-term conjugate gradient method based on spectral scaling conjugacy condition that generates descent search direction
- A Globally Convergent Stabilized SQP Method
- A primal-dual augmented Lagrangian
- A variance-based method to rank input variables of the mesh adaptive direct search algorithm
- A self-adaptive three-term conjugate gradient method for monotone nonlinear equations with convex constraints
- A modified scaling parameter for the memoryless BFGS updating formula
- Incomplete Cholesky Factorizations with Limited Memory
- A Truncated Newton Algorithm for Large Scale Box Constrained Optimization
- Two modified scaled nonlinear conjugate gradient methods
- Augmented Lagrangian applied to convex quadratic problems
- A Sequential Linear Constraint Programming Algorithm for NLP
- A conjugate gradient method for unconstrained optimization problems
- Accelerated hybrid conjugate gradient algorithm with modified secant condition for unconstrained optimization
- Hybrid conjugate gradient algorithm for unconstrained optimization
- Notes on the Dai-Yuan-Yuan modified spectral gradient method
- Sparse second order cone programming formulations for convex optimization problems
- A combined class of self-scaling and modified quasi-Newton methods
- A nonmonotone filter method for nonlinear optimization
- Convergence analysis of a modified BFGS method on convex minimizations
- New quasi-Newton equation and related methods for unconstrained optimization
- A modified BFGS algorithm based on a hybrid secant equation
- Adaptive cubic regularisation methods for unconstrained optimization. I: Motivation, convergence and numerical results
- A stabilized filter SQP algorithm for nonlinear programming
- Primal and dual active-set methods for convex quadratic programming
- An active set truncated Newton method for large-scale bound constrained optimization
- Global and Finite Termination of a Two-Phase Augmented Lagrangian Filter Method for General Quadratic Programs
- A scaled BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- The Limited Memory Conjugate Gradient Method
- Practical active-set Euclidian trust-region method with spectral projected gradients for bound-constrained minimization
- A non-monotone line search algorithm for unconstrained optimization
- A primal-dual regularized interior-point method for convex quadratic programs
- On solving trust-region and other regularised subproblems in optimization
- Computation of sparse low degree interpolating polynomials and their application to derivative-free optimization
- A trust region affine scaling method for bound constrained optimization
- A quadratic hybridization of Polak-Ribière-Polyak and Fletcher-Reeves conjugate gradient methods
- A projected-gradient interior-point algorithm for complementarity problems
- A comparison of complete global optimization solvers
- Subspace Barzilai-Borwein gradient method for large-scale bound constrained optimization
- A trust region SQP algorithm for equality constrained parameter estimation with simple parameter bounds
- On the convergence of an inexact Gauss-Newton trust-region method for nonlinear least-squares problems with simple bounds
- A hybridization of the Polak-Ribière-Polyak and Fletcher-Reeves conjugate gradient methods
- Globally solving nonconvex quadratic programming problems via completely positive programming
- Iterative methods for finding a trust-region step
- Benchmarking global optimization and constraint satisfaction codes
- A derivative-free algorithm for least-squares minimization
- Global convergence of a modified Fletcher-Reeves conjugate gradient method with Armijo-type line search
- The Dai-Liao nonlinear conjugate gradient method with optimal parameter choices
- Using constraint preconditioners with regularized saddle-point problems
- A new SQP-filter method for solving nonlinear programming problems
- A descent extension of the Polak-Ribière-Polyak conjugate gradient method
- An inexact restoration strategy for the globalization of the sSQP method
- Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- A feasible active set method for strictly convex quadratic problems with simple bounds
- Inertia-controlling factorizations for optimization algorithms
- An interior algorithm for nonlinear optimization that combines line search and trust region steps
- How good are projection methods for convex feasibility problems?
- A new method of moving asymptotes for large-scale unconstrained optimization
- A modified self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno method for unconstrained optimization
- Recent advances in trust region algorithms
- Matching-based preprocessing algorithms to the solution of saddle-point problems in large-scale nonconvex interior-point optimization
- A modified scaled memoryless BFGS preconditioned conjugate gradient method for unconstrained optimization
- A derivative-free algorithm for inequality constrained nonlinear programming via smoothing of an \(\ell_\infty\) penalty function
- A family of three-term conjugate gradient methods with sufficient descent property for unconstrained optimization
- An inexact and nonmonotone proximal method for smooth unconstrained minimization
- A Matrix-Free Algorithm for Equality Constrained Optimization Problems with Rank-Deficient Jacobians
- Mixed integer nonlinear programming tools: an updated practical overview
- A Preconditioner for Linear Systems Arising From Interior Point Optimization Methods
- Implementing Generating Set Search Methods for Linearly Constrained Minimization
- SNOPT: An SQP Algorithm for Large-Scale Constrained Optimization
- CUTEst: a constrained and unconstrained testing environment with safe threads for mathematical optimization
- A globally convergent penalty-free method for optimization with equality constraints and simple bounds
- Spectral method and its application to the conjugate gradient method
- Structured minimal-memory inexact quasi-Newton method and secant preconditioners for augmented Lagrangian optimization
- The cyclic Barzilai–Borwein method for unconstrained optimization
- On the implementation of an interior-point filter line-search algorithm for large-scale nonlinear programming
- A family of second-order methods for convex \(\ell _1\)-regularized optimization
- The optimization test environment
- An efficient modification of the Hestenes-Stiefel nonlinear conjugate gradient method with restart property