Cited in
(only showing first 100 items)
- Preconditioning saddle-point systems with applications in optimization
- New nonlinear conjugate gradient methods based on optimal Dai-Liao parameters
- An affine scaling interior trust-region method combining with line search filter technique for optimization subject to bounds on variables
- An active set modified Polak-Ribière-Polyak method for large-scale nonlinear bound constrained optimization
- A nonmonotone scaled conjugate gradient algorithm for large-scale unconstrained optimization
- A modified spectral conjugate gradient method with global convergence
- A Preconditioner for Linear Systems Arising From Interior Point Optimization Methods
- An adaptive trust region method based on simple conic models
- Implementing Generating Set Search Methods for Linearly Constrained Minimization
- A modified conjugate gradient method for general convex functions
- Numerical experience with a class of self-scaling quasi-Newton algorithms
- A note on the implementation of an interior-point algorithm for nonlinear optimization with inexact step computations
- A scaled nonlinear conjugate gradient algorithm for unconstrained optimization
- A Structured Quasi-Newton Algorithm for Optimizing with Incomplete Hessian Information
- MINPACK
- A hybrid conjugate gradient method with descent property for unconstrained optimization
- Adaptive, limited-memory BFGS algorithms for unconstrained optimization
- A matrix-free approach to build band preconditioners for large-scale bound-constrained optimization
- A regularized Newton method without line search for unconstrained optimization
- On per-iteration complexity of high order Chebyshev methods for sparse functions with banded Hessians
- Sobolev seminorm of quadratic functions with applications to derivative-free optimization
- A new framework for the computation of Hessians
- A primal-dual interior-point algorithm for nonlinear least squares constrained problems
- Mixed integer nonlinear programming tools: a practical overview
- A nonmonotone approximate sequence algorithm for unconstrained nonlinear optimization
- A nonmonotone weighting self-adaptive trust region algorithm for unconstrained nonconvex optimization
- The flattened aggregate constraint homotopy method for nonlinear programming problems with many nonlinear constraints
- An indicator for the switch from derivative-free to derivative-based optimization
- Recent advances in bound constrained optimization
- A globally convergent penalty-free method for optimization with equality constraints and simple bounds
- A modified three-term conjugate gradient method with sufficient descent property
- A subspace limited memory quasi-Newton algorithm for large-scale nonlinear bound constrained optimization
- Modified nonmonotone Armijo line search for descent method
- Efficient use of parallelism in algorithmic parameter optimization applications
- scientific article; zbMATH DE number 1568986
- Modification of nonlinear conjugate gradient method with weak Wolfe-Powell line search
- CUTEst: a constrained and unconstrained testing environment with safe threads for mathematical optimization
- Spectral method and its application to the conjugate gradient method
- Two classes of spectral conjugate gradient methods for unconstrained optimizations
- Two modified conjugate gradient methods for unconstrained optimization with applications in image restoration problems
- An efficient conjugate gradient-based algorithm for unconstrained optimization and its projection extension to large-scale constrained nonlinear equations with applications in signal recovery and image denoising problems
- A primal-dual interior-point relaxation method with global and rapidly local convergence for nonlinear programs
- Dai-Liao extensions of a descent hybrid nonlinear conjugate gradient method with application in signal processing
- A family of second-order methods for convex \(\ell_1\)-regularized optimization
- A local search method for costly black-box problems and its application to CSP plant start-up optimization refinement
- SNOPT: An SQP Algorithm for Large-Scale Constrained Optimization
- Infeasibility Detection and SQP Methods for Nonlinear Optimization
- Simple sequential quadratically constrained quadratic programming feasible algorithm with active identification sets for constrained minimax problems
- Two sufficient descent three-term conjugate gradient methods for unconstrained optimization problems with applications in compressive sensing
- A new family of hybrid three-term conjugate gradient methods with applications in image restoration
- Structured minimal-memory inexact quasi-Newton method and secant preconditioners for augmented Lagrangian optimization
- On the implementation of an interior-point filter line-search algorithm for large-scale nonlinear programming
- Nonmonotone adaptive trust region method with line search based on new diagonal updating
- The optimization test environment
- A Subspace Minimization Method for the Trust-Region Step
- Global convergence of quasi-Newton methods based on adjoint Broyden updates
- A framework for simulating and estimating the state and functional topology of complex dynamic geometric networks
- The cyclic Barzilai–Borwein method for unconstrained optimization
- An improved Polak-Ribière-Polyak conjugate gradient method with an efficient restart direction
- Multi-step spectral gradient methods with modified weak secant relation for large scale unconstrained optimization
- Low-rank update of preconditioners for the inexact Newton method with SPD Jacobian
- A note on the use of vector barrier parameters for interior-point methods
- On Modified Factorizations for Large-Scale Linearly Constrained Optimization
- Diagonal quasi-Newton methods via least change updating principle with weighted Frobenius norm
- Inverse problems and solution methods for a class of nonlinear complementarity problems
- An efficient modification of the Hestenes-Stiefel nonlinear conjugate gradient method with restart property
- Efficient tridiagonal preconditioner for the matrix-free truncated Newton method
- Quasi-Newton methods based on ordinary differential equation approach for unconstrained nonlinear optimization
- A fast convergent sequential linear equation method for inequality constrained optimization without strict complementarity
- On Hager and Zhang's conjugate gradient method with guaranteed descent
- Convergence of the nonmonotone Perry and Shanno method for optimization
- A globally and superlinearly convergent primal-dual interior point trust region method for large scale constrained optimization
- Combining and scaling descent and negative curvature directions
- scientific article; zbMATH DE number 2063453
- The convergence of conjugate gradient method with nonmonotone line search
- Computationally Efficient Decompositions of Oblique Projection Matrices
- An active-set trust-region method for derivative-free nonlinear bound-constrained optimization
- A new diagonal quasi-Newton updating method with scaled forward finite differences directional derivative for unconstrained optimization
- On the Implementation of an Algorithm for Large-Scale Equality Constrained Optimization
- A three term Polak-Ribière-Polyak conjugate gradient method close to the memoryless BFGS quasi-Newton method
- A symmetric rank-one quasi-Newton line-search method using negative curvature directions
- A sufficient descent Liu–Storey conjugate gradient method and its global convergence
- An algorithm for nonlinear optimization using linear programming and equality constrained subproblems
- A conjugate gradient method based on a modified secant relation for unconstrained optimization
- An Algebraic Analysis of a Block Diagonal Preconditioner for Saddle Point Systems
- Designing an optimal search algorithm with respect to prior information
- Preconditioning Newton-Krylov methods in nonconvex large scale optimization
- A three-term conjugate gradient method with sufficient descent property for unconstrained optimization
- Combining DCA (DC Algorithms) and interior point techniques for large-scale nonconvex quadratic programming
- A modified nonlinear conjugate gradient method with the Armijo line search and its application
- Cubic regularization in symmetric rank-1 quasi-Newton methods
- Global and local convergence of a class of penalty-free-type methods for nonlinear programming
- Two extensions of the Dai-Liao method with sufficient descent property based on a penalization scheme
- A sequential quadratic programming algorithm with an additional equality constrained phase
- Benchmarking nonlinear optimization software in technical computing environments
- A simulated annealing-based Barzilai-Borwein gradient method for unconstrained optimization problems
- A new general form of conjugate gradient methods with guaranteed descent and strong global convergence properties
- Another improved Wei-Yao-Liu nonlinear conjugate gradient method with sufficient descent property
- Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization
- Interior-point methods for nonconvex nonlinear programming: Regularization and warmstarts
This page was built for software: CUTEr