swMATH: 4813 · MaRDI QID: Q16975 · FDO: Q16975
Author name not available
Official website: http://dl.acm.org/citation.cfm?id=1132979
Cited In (only showing first 100 items)
- Two optimal Hager-Zhang conjugate gradient methods for solving monotone nonlinear equations
- An efficient hybrid conjugate gradient method for unconstrained optimization
- Reduced order optimal control of the convective FitzHugh-Nagumo equations
- Recent advances in bound constrained optimization
- Global optimization through a stochastic perturbation of the Polak-Ribière conjugate gradient method
- A modified three-term conjugate gradient method with sufficient descent property
- A modified conjugate gradient method for monotone nonlinear equations with convex constraints
- A three-term Polak-Ribière-Polyak conjugate gradient method close to the memoryless BFGS quasi-Newton method
- Performance evaluation of descent CG methods for neural network training
- Two extensions of the Dai-Liao method with sufficient descent property based on a penalization scheme
- A descent conjugate gradient algorithm for optimization problems and its applications in image restoration and compression sensing
- Quasi-Newton acceleration for equality-constrained minimization
- The global convergence of the Polak-Ribière-Polyak conjugate gradient algorithm under inexact line search for nonconvex functions
- A three-term conjugate gradient algorithm with quadratic convergence for unconstrained optimization problems
- Two adaptive Dai-Liao nonlinear conjugate gradient methods
- The global proof of the Polak-Ribière-Polyak algorithm under the YWL inexact line search technique
- BOX-QUACAN
- Family weak conjugate gradient algorithms and their convergence analysis for nonconvex functions
- Nonlinear parameter optimization using R tools
- Two-parameter scaled memoryless BFGS methods with a nonmonotone choice for the initial step length
- A conjugate gradient algorithm and its applications in image restoration
- A modified nonmonotone trust region line search method
- A linear hybridization of the Hestenes-Stiefel method and the memoryless BFGS technique
- Optimal control of convective FitzHugh-Nagumo equation
- A modified descent Polak-Ribière-Polyak conjugate gradient method with global convergence property for nonconvex functions
- An optimal parameter choice for the Dai-Liao family of conjugate gradient methods by avoiding a direction of the maximum magnification by the search direction matrix
- Conjugate gradient methods using value of objective function for unconstrained optimization
- Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search
- A descent Dai-Liao projection method for convex constrained nonlinear monotone equations with applications
- Line search fixed point algorithms based on nonlinear conjugate gradient directions: application to constrained smooth convex optimization
- Accelerated adaptive Perry conjugate gradient algorithms based on the self-scaling memoryless BFGS update
- An improved Perry conjugate gradient method with adaptive parameter choice
- Preconditioned nonlinear conjugate gradient methods based on a modified secant equation
- A simple sufficient descent method for unconstrained optimization
- Two accelerated nonmonotone adaptive trust region line search methods
- A stochastic subspace approach to gradient-free optimization in high dimensions
- Several accelerated subspace minimization conjugate gradient methods based on regularization model and convergence rate analysis for nonconvex problems
- W-methods in optimal control
- A modified Hestenes-Stiefel conjugate gradient method close to the memoryless BFGS quasi-Newton method
- A Perry-type derivative-free algorithm for solving nonlinear system of equations and minimizing \(\ell_1\) regularized problem
- Some sufficient descent conjugate gradient methods and their global convergence
- An improved Dai-Kou conjugate gradient algorithm for unconstrained optimization
- A conjugate gradient method with sufficient descent and global convergence for unconstrained nonlinear optimization
- A subspace minimization conjugate gradient method based on conic model for unconstrained optimization
- LMBOPT: a limited memory method for bound-constrained optimization
- On the sufficient descent condition of the Hager-Zhang conjugate gradient methods
- A class of descent four-term extension of the Dai-Liao conjugate gradient method based on the scaled memoryless BFGS update
- Scaled nonlinear conjugate gradient methods for nonlinear least squares problems
- An adaptive conjugate gradient algorithm for large-scale unconstrained optimization
- Norm descent conjugate gradient methods for solving symmetric nonlinear equations
- HypergeometricFunctions.jl
- LMBOPT
- A Barzilai-Borwein gradient projection method for sparse signal and blurred image restoration
- The global convergence of a new mixed conjugate gradient method for unconstrained optimization
- A new subspace minimization conjugate gradient method based on conic model for large-scale unconstrained optimization
- Riemannian multigrid line search for low-rank problems
- An active set trust-region method for bound-constrained optimization
- On optimality of two adaptive choices for the parameter of Dai-Liao method
- A nonmonotone scaled conjugate gradient algorithm for large-scale unconstrained optimization
- CGRS -- an advanced hybrid method for global optimization of continuous functions closely coupling extended random search and conjugate gradient method
- Dai-Kou type conjugate gradient methods with a line search only using gradient
- A class of adaptive Dai-Liao conjugate gradient methods based on the scaled memoryless BFGS update
- A modified three-term PRP conjugate gradient algorithm for optimization models
- A modified PRP conjugate gradient algorithm with nonmonotone line search for nonsmooth convex optimization problems
- Spectral method and its application to the conjugate gradient method
- Structured minimal-memory inexact quasi-Newton method and secant preconditioners for augmented Lagrangian optimization
- The cyclic Barzilai-Borwein method for unconstrained optimization
- Numerical methods for large-scale nonlinear optimization
- On Hager and Zhang's conjugate gradient method with guaranteed descent
- The convergence of conjugate gradient method with nonmonotone line search
- Comparison of advanced large-scale minimization algorithms for the solution of inverse ill-posed problems
- A new general form of conjugate gradient methods with guaranteed descent and strong global convergence properties
- Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization
- Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems
- Some modified conjugate gradient methods for unconstrained optimization
- Using approximate secant equations in limited memory methods for multilevel unconstrained optimization
- An optimal parameter for Dai-Liao family of conjugate gradient methods
- A modified Polak-Ribière-Polyak conjugate gradient algorithm for nonsmooth convex programs
- Two modified scaled nonlinear conjugate gradient methods
- A conjugate gradient method for unconstrained optimization problems
- The global convergence of a descent PRP conjugate gradient method
- Second-order adjoints for solving PDE-constrained optimization problems
- A new family of conjugate gradient methods
- Improved Fletcher-Reeves and Dai-Yuan conjugate gradient methods with the strong Wolfe line search
- The Limited Memory Conjugate Gradient Method
- Two proposals for robust PCA using semidefinite programming
- An efficient multigrid strategy for large-scale molecular mechanics optimization
- TNPACK
- CONMIN
- DONLP2
- CONV_QP
- MINPACK-2
- GPDT
- Algorithm 500
- NLPHOPDM
- CPMD
- LDGB
- Algorithm 738
- SCALCG
- CUTE
This page was built for software: CG_DESCENT
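CG_DESCENT, the software this record indexes, is Hager and Zhang's conjugate gradient code, built around their conjugate-direction update with guaranteed descent. The following is a minimal, hedged sketch of that update in Python: the Hager-Zhang beta and its lower-bound truncation follow the published formulas, but the simple backtracking Armijo line search stands in for the approximate Wolfe search the actual software uses, and the function names here are illustrative, not part of the real package.

```python
import numpy as np

def hager_zhang_cg(f, grad, x0, tol=1e-8, max_iter=500):
    """Sketch of nonlinear CG with the Hager-Zhang direction update.

    Uses a plain backtracking Armijo line search (the real CG_DESCENT
    uses an approximate Wolfe line search instead).
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                  # initial steepest-descent direction
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking Armijo line search along d.
        alpha, fx, slope = 1.0, f(x), g.dot(d)
        while f(x + alpha * d) > fx + 1e-4 * alpha * slope:
            alpha *= 0.5
            if alpha < 1e-16:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g                       # gradient difference y_k
        dy = d.dot(y)                       # curvature term d_k^T y_k
        if abs(dy) > 1e-12:
            # Hager-Zhang beta:
            # beta = (y - 2 d ||y||^2 / (d^T y))^T g_new / (d^T y)
            beta = (y - 2.0 * d * y.dot(y) / dy).dot(g_new) / dy
            # Truncation beta >= eta_k that safeguards global convergence.
            eta = -1.0 / (np.linalg.norm(d) * min(0.01, np.linalg.norm(g)))
            beta = max(beta, eta)
        else:
            beta = 0.0                      # restart with steepest descent
        x, g = x_new, g_new
        d = -g + beta * d
    return x

# Tiny usage example on a strictly convex quadratic 0.5 x'Ax - b'x,
# whose unique minimizer is the solution of A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x.dot(A).dot(x) - b.dot(x)
grad = lambda x: A.dot(x) - b
x_star = hager_zhang_cg(f, grad, np.zeros(2))
```

On this quadratic the iterates converge to the linear-system solution `np.linalg.solve(A, b)`; the truncation step is what distinguishes the Hager-Zhang scheme from earlier Polak-Ribière-type updates in the titles listed above.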