SifDec
From MaRDI portal
Software: 16274
swMATH: 4087 · MaRDI QID: Q16274 · FDO: Q16274
Author name not available
Cited In (only showing first 100 items)
- A modified conjugate gradient method for general convex functions
- A Modified PRP Conjugate Gradient Algorithm with Trust Region for Optimization Problems
- A Structured Quasi-Newton Algorithm for Optimizing with Incomplete Hessian Information
- A nonmonotone weighting self-adaptive trust region algorithm for unconstrained nonconvex optimization
- The flattened aggregate constraint homotopy method for nonlinear programming problems with many nonlinear constraints
- Two classes of spectral conjugate gradient methods for unconstrained optimizations
- An improved Polak-Ribière-Polyak conjugate gradient method with an efficient restart direction
- Active Set Identification for Linearly Constrained Minimization Without Explicit Derivatives
- Computationally Efficient Decompositions of Oblique Projection Matrices
- Mesh-based Nelder-Mead algorithm for inequality constrained optimization
- Efficient solution of quadratically constrained quadratic subproblems within the mesh adaptive direct search algorithm
- A modified secant equation quasi-Newton method for unconstrained optimization
- Asynchronous Parallel Generating Set Search for Linearly Constrained Optimization
- A comparison of reduced and unreduced KKT systems arising from interior point methods
- A scaled conjugate gradient method with moving asymptotes for unconstrained optimization problems
- Best practices for comparing optimization algorithms
- A frame-based conjugate gradients direct search method with radial basis function interpolation model
- A new superlinearly convergent algorithm of combining QP subproblem with system of linear equations for nonlinear optimization
- An improved Perry conjugate gradient method with adaptive parameter choice
- Title not available
- A subspace version of the Powell-Yuan trust-region algorithm for equality constrained optimization
- Limited-memory LDL\(^{\top}\) factorization of symmetric quasi-definite matrices with application to constrained optimization
- New subspace minimization conjugate gradient methods based on regularization model for unconstrained optimization
- Efficient solution of many instances of a simulation-based optimization problem utilizing a partition of the decision space
- Compact representations of structured BFGS matrices
- Minimizing the Condition Number for Small Rank Modifications
- Several accelerated subspace minimization conjugate gradient methods based on regularization model and convergence rate analysis for nonconvex problems
- A new subspace minimization conjugate gradient method based on modified secant equation for unconstrained optimization
- A feasible filter SQP algorithm with global and local convergence
- A retrospective trust-region method for unconstrained optimization
- An improved Dai-Kou conjugate gradient algorithm for unconstrained optimization
- A subspace minimization conjugate gradient method based on conic model for unconstrained optimization
- A class of derivative-free nonmonotone optimization algorithms employing coordinate rotations and gradient approximations
- Corrigendum to: ``Krasnosel'skii type hybrid fixed point theorems and their applications to fractional integral equations''
- Using improved directions of negative curvature for the solution of bound-constrained nonconvex problems
- A nonmonotone inexact Newton method for unconstrained optimization
- A regularized limited memory BFGS method for large-scale unconstrained optimization and its efficient implementations
- A \(q\)-Polak-Ribière-Polyak conjugate gradient algorithm for unconstrained optimization problems
- A mixed logarithmic barrier-augmented Lagrangian method for nonlinear optimization
- Secant penalized BFGS: a noise robust quasi-Newton method via penalizing the secant condition
- A new subspace minimization conjugate gradient method based on conic model for large-scale unconstrained optimization
- A dense initialization for limited-memory quasi-Newton methods
- The Mesh Adaptive Direct Search Algorithm for Granular and Discrete Variables
- Spectral analysis of the preconditioned system for the \(3 \times 3\) block saddle point problem
- Evaluating bound-constrained minimization software
- A modified nonmonotone BFGS algorithm for unconstrained optimization
- Two modified spectral conjugate gradient methods and their global convergence for unconstrained optimization
- On the use of iterative methods in cubic regularization for unconstrained optimization
- A derivative-free algorithm for systems of nonlinear inequalities
- Combining cross-entropy and MADS methods for inequality constrained global optimization
- Modification of the Wolfe line search rules to satisfy the descent condition in the Polak-Ribière-Polyak conjugate gradient method
- A Feasible Active Set Method for Strictly Convex Quadratic Problems with Simple Bounds
- SNOPT: An SQP Algorithm for Large-Scale Constrained Optimization
- CUTEst: a constrained and unconstrained testing environment with safe threads for mathematical optimization
- Infeasibility Detection and SQP Methods for Nonlinear Optimization
- An Interior-Point Algorithm for Large-Scale Nonlinear Optimization with Inexact Step Computations
- On the implementation of an interior-point filter line-search algorithm for large-scale nonlinear programming
- A family of second-order methods for convex \(\ell _1\)-regularized optimization
- Iterative Methods for Finding a Trust-region Step
- A Subspace Minimization Method for the Trust-Region Step
- Nonmonotone adaptive trust region method with line search based on new diagonal updating
- The optimization test environment
- A note on the use of vector barrier parameters for interior-point methods
- An active-set trust-region method for derivative-free nonlinear bound-constrained optimization
- An algorithm for nonlinear optimization using linear programming and equality constrained subproblems
- A sequential quadratic programming algorithm with an additional equality constrained phase
- Preconditioning Newton-Krylov methods in nonconvex large scale optimization
- Benchmarking nonlinear optimization software in technical computing environments
- A modified PRP conjugate gradient method
- An inexact Newton method for nonconvex equality constrained optimization
- A Multidimensional Filter Algorithm for Nonlinear Equations and Nonlinear Least-Squares
- Some modified conjugate gradient methods for unconstrained optimization
- Trust region algorithm with two subproblems for bound constrained problems
- OrthoMADS: A Deterministic MADS Instance with Orthogonal Directions
- Constraint-Style Preconditioners for Regularized Saddle Point Problems
- Finding Optimal Algorithmic Parameters Using Derivative‐Free Optimization
- A limited memory steepest descent method
- PSwarm: a hybrid solver for linearly constrained global derivative-free optimization
- A nonmonotone truncated Newton-Krylov method exploiting negative curvature directions, for large scale unconstrained optimization
- Implicit-Factorization Preconditioning and Iterative Solvers for Regularized Saddle-Point Systems
- Strategies for Scaling and Pivoting for Sparse Symmetric Indefinite Problems
- An optimal parameter for Dai-Liao family of conjugate gradient methods
- Trust-region and other regularisations of linear least-squares problems
- A primal-dual augmented Lagrangian
- Optimization theory and methods. Nonlinear programming
- A variance-based method to rank input variables of the mesh adaptive direct search algorithm
- Iterative computation of negative curvature directions in large scale optimization
- Sparse second order cone programming formulations for convex optimization problems
- An active set feasible method for large-scale minimization problems with bound constraints
- A combined class of self-scaling and modified quasi-Newton methods
- Convergence analysis of a modified BFGS method on convex minimizations
- A spectral Dai-Yuan-type conjugate gradient method for unconstrained optimization
- A modified BFGS algorithm based on a hybrid secant equation
- Using Sampling and Simplex Derivatives in Pattern Search Methods
- Adaptive cubic regularisation methods for unconstrained optimization. I: Motivation, convergence and numerical results
- A stabilized filter SQP algorithm for nonlinear programming
- Primal and dual active-set methods for convex quadratic programming
- New line search methods for unconstrained optimization
- Improved Fletcher-Reeves and Dai-Yuan conjugate gradient methods with the strong Wolfe line search
- A Globally Convergent Linearly Constrained Lagrangian Method for Nonlinear Optimization