CUTE

From MaRDI portal

Publication:4371578

DOI: 10.1145/200979.201043
zbMath: 0886.65058
OpenAlex: W1968553199
Wikidata: Q113310435
Scholia: Q113310435
MaRDI QID: Q4371578

No author found.

Publication date: 26 January 1998

Published in: ACM Transactions on Mathematical Software

Full work available at URL: http://purl.org/net/epubs/work/56246



Related Items

The Hager–Zhang conjugate gradient algorithm for large-scale nonlinear equations, On the performance of switching BFGS/SR1 algorithms for unconstrained optimization, A new framework for the computation of Hessians, Simultaneous iterative solutions for the trust-region and minimum eigenvalue subproblem, A modified sufficient descent Polak-Ribière-Polyak type conjugate gradient method for unconstrained optimization problems, An augmented Lagrangian affine scaling method for nonlinear programming, Unnamed Item, Matching-based preprocessing algorithms to the solution of saddle-point problems in large-scale nonconvex interior-point optimization, An optimal control framework for dynamic induction control of wind farms and their interaction with the atmospheric boundary layer, Shifted L-BFGS systems, Cubic overestimation and secant updating for unconstrained optimization of \(C^{2,1}\) functions, An infeasible QP-free algorithm without a penalty function or a filter for nonlinear inequality-constrained optimization, A new nonmonotone trust-region method of conic model for solving unconstrained optimization, New three-term conjugate gradient method with guaranteed global convergence, A method combining norm-relaxed QCQP subproblems with active set identification for inequality constrained optimization, A sequential quadratic programming algorithm without a penalty function, a filter or a constraint qualification for inequality constrained optimization, Globally convergence of nonlinear conjugate gradient method for unconstrained optimization, An efficient projection-based algorithm without Lipschitz continuity for large-scale nonlinear pseudo-monotone equations, A modified limited-memory BNS method for unconstrained minimization based on the conjugate directions idea, A decent three term conjugate gradient method with global convergence properties for large scale unconstrained optimization problems, New conjugate gradient method for unconstrained optimization, Gradient method with 
multiple damping for large-scale unconstrained optimization, Global convergence of a modified Fletcher–Reeves conjugate gradient method with Wolfe line search, A scaled nonlinear conjugate gradient algorithm for unconstrained optimization, A new method of moving asymptotes for large-scale linearly equality-constrained minimization, Global convergence of a modified Fletcher-Reeves conjugate gradient method with Armijo-type line search, Some modified Yabe–Takano conjugate gradient methods with sufficient descent condition, A nonmonotone approximate sequence algorithm for unconstrained nonlinear optimization, Computational experiments with scaled initial Hessian approximation for the Broyden family methods, An accurate active set conjugate gradient algorithm with project search for bound constrained optimization, A conjugate directions approach to improve the limited-memory BFGS method, Spectral conjugate gradient methods with sufficient descent property for large-scale unconstrained optimization, An efficient adaptive three-term extension of the Hestenes–Stiefel conjugate gradient method, Improving solver success in reaching feasibility for sets of nonlinear constraints, A Subspace Modified PRP Method for Large-scale Nonlinear Box-Constrained Optimization, A modified Hestenes–Stiefel conjugate gradient method close to the memoryless BFGS quasi-Newton method, A new accelerated diagonal quasi-Newton updating method with scaled forward finite differences directional derivative for unconstrained optimization, A subspace limited memory quasi-Newton algorithm for large-scale nonlinear bound constrained optimization, Tuning Algorithms for Stochastic Black-Box Optimization: State of the Art and Future Perspectives, Augmented Lagrangian methods under the constant positive linear dependence constraint qualification, A NONMONOTONE FILTER BARZILAI-BORWEIN METHOD FOR OPTIMIZATION, A CONVEX APPROXIMATION METHOD FOR LARGE SCALE LINEAR INEQUALITY CONSTRAINED MINIMIZATION, A Dai-Yuan 
conjugate gradient algorithm with sufficient descent and conjugacy conditions for unconstrained optimization, A matrix-free line-search algorithm for nonconvex optimization, On the method of shortest residuals for unconstrained optimization, The Sequential Quadratic Programming Method, Scaled conjugate gradient algorithms for unconstrained optimization, Structured minimal-memory inexact quasi-Newton method and secant preconditioners for augmented Lagrangian optimization, Self-adaptive inexact proximal point methods, Conjugate gradient (CG)-type method for the solution of Newton's equation within optimization frameworks, A trust region method for optimization problem with singular solutions, Another hybrid conjugate gradient algorithm for unconstrained optimization, Nonconvex optimization using negative curvature within a modified linesearch, Two descent hybrid conjugate gradient methods for optimization, A class of diagonal preconditioners for limited memory BFGS method, Primal-dual nonlinear rescaling method with dynamic scaling parameter update, A new family of penalties for augmented Lagrangian methods, An interior algorithm for nonlinear optimization that combines line search and trust region steps, Solving mathematical programs with complementarity constraints as nonlinear programs, Convergence of nonmonotone line search method, A nonmonotone Broyden method for unconstrained optimization, Mathematical programming models and algorithms for engineering design optimization, An acceleration of gradient descent algorithm with backtracking for unconstrained optimization, A Simulated Annealing-Based Barzilai–Borwein Gradient Method for Unconstrained Optimization Problems, A subspace implementation of quasi-Newton trust region methods for unconstrained optimization, An Affine Scaling Interior Point Filter Line-Search Algorithm for Linear Inequality Constrained Minimization, An accelerated conjugate gradient algorithm with guaranteed descent and conjugacy 
conditions for unconstrained optimization, Some sufficient descent conjugate gradient methods and their global convergence, A QP-free algorithm of quasi-strongly sub-feasible directions for inequality constrained optimization, A family of three-term conjugate gradient methods with sufficient descent property for unconstrained optimization, A Framework for Simulating and Estimating the State and Functional Topology of Complex Dynamic Geometric Networks, Dynamic scaling based preconditioning for truncated Newton methods in large scale unconstrained optimization, A Shifted Primal-Dual Penalty-Barrier Method for Nonlinear Optimization, A new method of moving asymptotes for large-scale unconstrained optimization, A new family of conjugate gradient methods, A sufficient descent Liu–Storey conjugate gradient method and its global convergence, Nonlinear analysis: optimization methods, convergence theory, and applications, Corrigendum to: "Krasnosel'skii type hybrid fixed point theorems and their applications to fractional integral equations", A globally convergent primal-dual interior-point relaxation method for nonlinear programs, Derivative-free nonlinear optimization filter simplex, A sufficient descent conjugate gradient method and its global convergence, Scaled memoryless BFGS preconditioned steepest descent method for very large-scale unconstrained optimization, Derivative-free optimization and filter methods to solve nonlinear constrained problems, Adaptive, Limited-Memory BFGS Algorithms for Unconstrained Optimization, Structured symmetric rank-one method for unconstrained optimization, Sequential Quadratic Optimization for Nonlinear Equality Constrained Stochastic Optimization, AN ADAPTIVE GRADIENT ALGORITHM FOR LARGE-SCALE NONLINEAR BOUND CONSTRAINED OPTIMIZATION, A primal-dual interior-point algorithm for nonlinear least squares constrained problems, Planar conjugate gradient algorithm for large-scale unconstrained optimization. 
I: Theory, Planar conjugate gradient algorithm for large-scale unconstrained optimization. II: Application, Descent Property and Global Convergence of a New Search Direction Method for Unconstrained Optimization, A Modified Hestenes-Stiefel Conjugate Gradient Algorithm for Large-Scale Optimization, A preconditioner for solving large-scale variational inequality problems by a semismooth inexact approach, MULTIPLE USE OF BACKTRACKING LINE SEARCH IN UNCONSTRAINED OPTIMIZATION, A NEW THREE–TERM CONJUGATE GRADIENT METHOD WITH DESCENT DIRECTION FOR UNCONSTRAINED OPTIMIZATION, LOQO: an interior point code for quadratic programming, A repository of convex quadratic programming problems, SDPLIB 1.2, a library of semidefinite programming test problems, Extended Dai-Yuan conjugate gradient strategy for large-scale unconstrained optimization with applications to compressive sensing, A nonmonotone scaled conjugate gradient algorithm for large-scale unconstrained optimization, A globally convergent penalty-free method for optimization with equality constraints and simple bounds, An efficient modification of the Hestenes-Stiefel nonlinear conjugate gradient method with restart property, Efficient tridiagonal preconditioner for the matrix-free truncated Newton method, Quasi-Newton methods based on ordinary differential equation approach for unconstrained nonlinear optimization, A fast convergent sequential linear equation method for inequality constrained optimization without strict complementarity, On Hager and Zhang's conjugate gradient method with guaranteed descent, New hybrid conjugate gradient method for unconstrained optimization, Spectral method and its application to the conjugate gradient method, A limited memory BFGS algorithm for non-convex minimization with applications in matrix largest eigenvalue problem, A self-adaptive three-term conjugate gradient method for monotone nonlinear equations with convex constraints, A family of second-order methods for convex \(\ell 
_1\)-regularized optimization, Primal and dual active-set methods for convex quadratic programming, An active set truncated Newton method for large-scale bound constrained optimization, An inexact Newton method for nonconvex equality constrained optimization, Partial spectral projected gradient method with active-set strategy for linearly constrained optimization, Sufficient descent nonlinear conjugate gradient methods with conjugacy condition, Local convergence of an inexact-restoration method and numerical experiments, A modified quasi-Newton method for structured optimization with partial information on the Hessian, Monotone projected gradient methods for large-scale box-constrained quadratic programming, On the performance of a new symmetric rank-one method with restart for solving unconstrained optimization problems, A new subspace limited memory BFGS algorithm for large-scale bound constrained optimization, An improved spectral conjugate gradient algorithm for nonconvex unconstrained optimization problems, A restoration-free filter SQP algorithm for equality constrained optimization, Spectral scaling BFGS method, A scaled BFGS preconditioned conjugate gradient algorithm for unconstrained optimization, A practical relative error criterion for augmented Lagrangians, The convergence of conjugate gradient method with nonmonotone line search, Global convergence of a modified limited memory BFGS method for non-convex minimization, A simple sufficient descent method for unconstrained optimization, On per-iteration complexity of high order Chebyshev methods for sparse functions with banded Hessians, An adaptive trust region method based on simple conic models, A new general form of conjugate gradient methods with guaranteed descent and strong global convergence properties, A new sequential systems of linear equations algorithm of feasible descent for inequality constrained optimization, Improved Hessian approximation with modified secant equations for symmetric 
rank-one method, A modified conjugate gradient algorithm with cyclic Barzilai-Borwein steplength for unconstrained optimization, An active-set projected trust region algorithm for box constrained optimization problems, Another improved Wei-Yao-Liu nonlinear conjugate gradient method with sufficient descent property, A primal-dual augmented Lagrangian, Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization, Modified active set projected spectral gradient method for bound constrained optimization, An active set limited memory BFGS algorithm for bound constrained optimization, Globally convergent modified Perry's conjugate gradient method, Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization, Global convergence of a spectral conjugate gradient method for unconstrained optimization, Global convergence of some modified PRP nonlinear conjugate gradient methods, A new \(\varepsilon \)-generalized projection method of strongly sub-feasible directions for inequality constrained optimization, Interior-point methods for nonconvex nonlinear programming: cubic regularization, A symmetric rank-one method based on extra updating techniques for unconstrained optimization, A sufficient descent LS conjugate gradient method for unconstrained optimization problems, A working set SQCQP algorithm with simple nonmonotone penalty parameters, An active set limited memory BFGS algorithm for large-scale bound constrained optimization, Nonmonotone Barzilai-Borwein gradient algorithm for \(\ell_1\)-regularized nonsmooth minimization in compressive sensing, Augmented Lagrangian applied to convex quadratic problems, Global and local convergence of a nonmonotone SQP method for constrained nonlinear optimization, Global convergence of a nonmonotone trust region algorithm with memory for unconstrained optimization, Global 
convergence of a modified Hestenes-Stiefel nonlinear conjugate gradient method with Armijo line search, New version of the three-term conjugate gradient method based on spectral scaling conjugacy condition that generates descent search direction, Properties of the block BFGS update and its application to the limited-memory block BNS method for unconstrained minimization, Norm descent conjugate gradient methods for solving symmetric nonlinear equations, Simulated annealing with asymptotic convergence for nonlinear constrained optimization, Modified subspace limited memory BFGS algorithm for large-scale bound constrained optimization, A limited memory descent Perry conjugate gradient method, Dai-Kou type conjugate gradient methods with a line search only using gradient, A modified three-term PRP conjugate gradient algorithm for optimization models, An improved strongly sub-feasible SSLE method for optimization problems and numerical experiments, A conjugate gradient method for unconstrained optimization problems, Global and local convergence of a class of penalty-free-type methods for nonlinear programming, A nonmonotone filter method for nonlinear optimization, Modified nonmonotone Armijo line search for descent method, A modified CG-DESCENT method for unconstrained optimization, A robust implementation of a sequential quadratic programming algorithm with successive error restoration, Accelerated hybrid conjugate gradient algorithm with modified secant condition for unconstrained optimization, A new globalization technique for nonlinear conjugate gradient methods for nonconvex minimization, A feasible QP-free algorithm combining the interior-point method with active set for constrained optimization, Notes on the Dai-Yuan-Yuan modified spectral gradient method, Preconditioned conjugate gradient algorithms for nonconvex problems with box constraints, New accelerated conjugate gradient algorithms as a modification of Dai-Yuan's computational scheme for unconstrained 
optimization, Scaled memoryless symmetric rank one method for large-scale optimization, Some three-term conjugate gradient methods with the inexact line search condition, A norm descent derivative-free algorithm for solving large-scale nonlinear symmetric equations, A trajectory-based method for constrained nonlinear optimization problems, A new Liu-Storey type nonlinear conjugate gradient method for unconstrained optimization problems, A superlinearly convergent strongly sub-feasible SSLE-type algorithm with working set for nonlinearly constrained optimization, A limited memory BFGS-type method for large-scale unconstrained optimization, Global convergence of quasi-Newton methods based on adjoint Broyden updates, Two modified Dai-Yuan nonlinear conjugate gradient methods, A globally and quadratically convergent algorithm with efficient implementation for unconstrained optimization, A conjugate gradient method with sufficient descent property, Subspace Barzilai-Borwein gradient method for large-scale bound constrained optimization, CUTE, A globally convergent BFGS method with nonmonotone line search for non-convex minimization, A class of collinear scaling algorithms for bound-constrained optimization: Derivation and computational results, Hybrid conjugate gradient algorithm for unconstrained optimization, Acceleration of conjugate gradient algorithms for unconstrained optimization, A conic trust-region method and its convergence properties, Mixed integer nonlinear programming tools: a practical overview, Nonlinear conjugate gradient methods with sufficient descent condition for large-scale unconstrained optimization, A truncated descent HS conjugate gradient method and its global convergence, An improved Wei-Yao-Liu nonlinear conjugate gradient method for optimization computation, A primal-dual interior-point method capable of rapidly detecting infeasibility for nonlinear programs, A new simple model trust-region method with generalized Barzilai-Borwein parameter 
for large-scale optimization, Accelerated memory-less SR1 method with generalized secant equation for unconstrained optimization, Two efficient modifications of AZPRP conjugate gradient method with sufficient descent property, A note on memory-less SR1 and memory-less BFGS methods for large-scale unconstrained optimization, A descent hybrid conjugate gradient method based on the memoryless BFGS update, An accelerated three-term conjugate gradient method with sufficient descent condition and conjugacy condition, Corrected sequential linear programming for sparse minimax optimization, A globally convergent hybrid conjugate gradient method and its numerical behaviors, A family of three-term nonlinear conjugate gradient methods close to the memoryless BFGS method, Best practices for comparing optimization algorithms, Accelerated conjugate gradient algorithm with finite difference Hessian/vector product approximation for unconstrained optimization, Global convergence and the Powell singular function, A penalty-interior-point algorithm for nonlinear constrained optimization, Recent progress in unconstrained nonlinear optimization without derivatives, \(n\)-step quadratic convergence of a restart Liu-Storey type method, Mixed integer nonlinear programming tools: an updated practical overview, An improved Perry conjugate gradient method with adaptive parameter choice, The global convergence of the BFGS method with a modified WWP line search for nonconvex functions, Line search filter inexact secant methods for nonlinear equality constrained optimization, A class of one parameter conjugate gradient methods, A new conjugate gradient algorithm with sufficient descent property for unconstrained optimization, Numerical experiments with the Lancelot package (Release \(A\)) for large-scale nonlinear optimization, Computational experience with penalty-barrier methods for nonlinear programming, Modification of nonlinear conjugate gradient method with weak Wolfe-Powell line search, 
Nonmonotone spectral gradient method based on memoryless symmetric rank-one update for large-scale unconstrained optimization, Global convergence of a modified conjugate gradient method, Two new conjugate gradient methods for unconstrained optimization, A double parameter scaled BFGS method for unconstrained optimization, A note on the implementation of an interior-point algorithm for nonlinear optimization with inexact step computations, A regularized Newton method for degenerate unconstrained optimization problems, Evaluating bound-constrained minimization software, PAL-Hom method for QP and an application to LP, Diagonal approximation of the Hessian by finite differences for unconstrained optimization, An active set modified Polak-Ribière-Polyak method for large-scale nonlinear bound constrained optimization, Descentwise inexact proximal algorithms for smooth optimization, A class of accelerated conjugate-gradient-like methods based on a modified secant equation, New conjugate gradient algorithms based on self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno method, A double parameter self-scaling memoryless BFGS method for unconstrained optimization, A penalty-free method with superlinear convergence for equality constrained optimization, Benchmarking nonlinear optimization software in technical computing environments, A spectral conjugate gradient method for solving large-scale unconstrained optimization, A Gauss-Newton approach for solving constrained optimization problems using differentiable exact penalties, A filter algorithm with inexact line search, On a two-phase approximate greatest descent method for nonlinear optimization with equality constraints, Multi-step spectral gradient methods with modified weak secant relation for large scale unconstrained optimization, Reverse bridge theorem under constraint partition, A derivative-free Liu-Storey method for solving large-scale nonlinear systems of equations, A limited-memory optimization method using 
the infinitely many times repeated BNS update and conjugate directions, A modified conjugacy condition and related nonlinear conjugate gradient method, Sufficient descent Polak-Ribière-Polyak conjugate gradient algorithm for large-scale box-constrained optimization, A survey of gradient methods for solving nonlinear optimization, A hybrid of DL and WYL nonlinear conjugate gradient methods, A sequential quadratic programming with a dual parametrization approach to nonlinear semi-infinite programming, Diagonal quasi-Newton methods via least change updating principle with weighted Frobenius norm, A novel value for the parameter in the Dai-Liao-type conjugate gradient method, Cubic regularization in symmetric rank-1 quasi-Newton methods, An affine scaling interior trust-region method combining with line search filter technique for optimization subject to bounds on variables, A Dai-Liao conjugate gradient algorithm with clustering of eigenvalues, Simple sequential quadratically constrained quadratic programming feasible algorithm with active identification sets for constrained minimax problems, A superlinearly convergent SQP method without boundedness assumptions on any of the iterative sequences, Conjugate gradient methods using value of objective function for unconstrained optimization, A modified nonlinear conjugate gradient method with the Armijo line search and its application, Global convergence of a modified two-parameter scaled BFGS method with Yuan-Wei-Lu line search for unconstrained optimization, A scaled three-term conjugate gradient method for large-scale unconstrained optimization problem, Nonmonotone adaptive trust region method with line search based on new diagonal updating, A starting point strategy for nonlinear interior methods., A globally and superlinearly convergent primal-dual interior point trust region method for large scale constrained optimization, Some nonlinear conjugate gradient methods based on spectral scaling secant equations, 
Derivative-free restrictively preconditioned conjugate gradient path method without line search technique for solving linear equality constrained optimization, Nonmonotone strategy for minimization of quadratics with simple constraints., Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization, A new family of conjugate gradient methods for unconstrained optimization, An efficient modified AZPRP conjugate gradient method for large-scale unconstrained optimization problem, Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search, A spectral three-term Hestenes-Stiefel conjugate gradient method, Some three-term conjugate gradient methods with the new direction structure, Two limited-memory optimization methods with minimum violation of the previous secant conditions, Nonmonotone curvilinear line search methods for unconstrained optimization, Adaptive scaling damped BFGS method without gradient Lipschitz continuity, Behavior of the combination of PRP and HZ methods for unconstrained optimization, The global proof of the Polak-Ribière-Polak algorithm under the YWL inexact line search technique, Least-squares-based three-term conjugate gradient methods, A modified spectral conjugate gradient method with global convergence, A framework for globally convergent algorithms using gradient bounding functions, Two improved nonlinear conjugate gradient methods with the strong Wolfe line search, Global convergence of a modified spectral three-term CG algorithm for nonconvex unconstrained optimization problems, A three-term conjugate gradient method with accelerated subspace quadratic optimization, A hybrid conjugate gradient method with descent property for unconstrained optimization, A three term Polak-Ribière-Polyak conjugate gradient method close to the memoryless BFGS quasi-Newton method, A class of line search-type methods for nonsmooth convex regularized minimization, Two modified 
conjugate gradient methods for unconstrained optimization with applications in image restoration problems, An efficient conjugate gradient-based algorithm for unconstrained optimization and its projection extension to large-scale constrained nonlinear equations with applications in signal recovery and image denoising problems, Advances in design and implementation of optimization software, A primal-dual interior-point relaxation method with global and rapidly local convergence for nonlinear programs, A numerical study of limited memory BFGS methods, Inertia-controlling factorizations for optimization algorithms, CUTEst: a constrained and unconstrained testing environment with safe threads for mathematical optimization, A reduced proximal-point homotopy method for large-scale non-convex BQP, An adaptive sizing BFGS method for unconstrained optimization, A modified Perry conjugate gradient method and its global convergence, Some new three-term Hestenes–Stiefel conjugate gradient methods with affine combination, An efficient modified residual-based algorithm for large scale symmetric nonlinear equations by approximating successive iterated gradients, A modified four-term extension of the Dai-Liao conjugate gradient method, A fast inertial self-adaptive projection based algorithm for solving large-scale nonlinear monotone equations, Unnamed Item, An adaptive modified three-term conjugate gradient method with global convergence, Alternating cyclic vector extrapolation technique for accelerating nonlinear optimization algorithms and fixed-point mapping applications, Solving Unconstrained Optimization Problems with Some Three-term Conjugate Gradient Methods, A three-term projection method based on spectral secant equation for nonlinear monotone equations, Unnamed Item, A modified Hestenes–Stiefel conjugate gradient method with an optimal property, A New Diagonal Quasi-Newton Updating Method With Scaled Forward Finite Differences Directional Derivative for Unconstrained 
Optimization, Unnamed Item, Interior point methods for large-scale nonlinear programming, A modified Polak–Ribière–Polyak descent method for unconstrained optimization, A derivative-free PRP method for solving large-scale nonlinear systems of equations and its global convergence, Affine conjugate adaptive Newton methods for nonlinear elastomechanics, Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization, A nonmonotone semismooth inexact Newton method, Some descent three-term conjugate gradient methods and their global convergence, Sprouting search—an algorithmic framework for asynchronous parallel unconstrained optimization, A Two-Term PRP-Based Descent Method, Higher-order reverse automatic differentiation with emphasis on the third-order, Practical active-set Euclidian trust-region method with spectral projected gradients for bound-constrained minimization, A dwindling filter line search method for unconstrained optimization, Computing the sparsity pattern of Hessians using automatic differentiation, A one-parameter class of three-term conjugate gradient methods with an adaptive parameter choice, A new spectral conjugate gradient method for large-scale unconstrained optimization, Algorithm 943


Uses Software