CG_DESCENT
From MaRDI portal
Software: 16975
swMATH: 4813 · MaRDI QID: Q16975 · FDO: Q16975
Author name not available.
Cited In (first 100 items shown):
- Two optimal Hager-Zhang conjugate gradient methods for solving monotone nonlinear equations
- An efficient hybrid conjugate gradient method for unconstrained optimization
- Towards a comprehensive approach to optimal control of non-ideal binary batch distillation
- A modified HZ conjugate gradient algorithm without gradient Lipschitz continuity condition for nonconvex functions
- Dai-Liao extensions of a descent hybrid nonlinear conjugate gradient method with application in signal processing
- A class of accelerated subspace minimization conjugate gradient methods
- Two sufficient descent three-term conjugate gradient methods for unconstrained optimization problems with applications in compressive sensing
- A modified Hestenes–Stiefel conjugate gradient method close to the memoryless BFGS quasi-Newton method
- A modified conjugate gradient method for monotone nonlinear equations with convex constraints
- A three-term Polak-Ribière-Polyak conjugate gradient method close to the memoryless BFGS quasi-Newton method
- A Perry-type derivative-free algorithm for solving nonlinear systems of equations and minimizing ℓ1-regularized problems
- A descent conjugate gradient algorithm for optimization problems and its applications in image restoration and compression sensing
- The global convergence of the Polak-Ribière-Polyak conjugate gradient algorithm under inexact line search for nonconvex functions
- A three-term conjugate gradient algorithm with quadratic convergence for unconstrained optimization problems
- An efficient adaptive scaling parameter for the spectral conjugate gradient method
- Family weak conjugate gradient algorithms and their convergence analysis for nonconvex functions
- Two-parameter scaled memoryless BFGS methods with a nonmonotone choice for the initial step length
- The Hager–Zhang conjugate gradient algorithm for large-scale nonlinear equations
- A conjugate gradient algorithm and its applications in image restoration
- Initial improvement of the hybrid accelerated gradient descent process
- A linear hybridization of the Hestenes-Stiefel method and the memoryless BFGS technique
- A modified descent Polak-Ribière-Polyak conjugate gradient method with global convergence property for nonconvex functions
- Conjugate gradient methods using value of objective function for unconstrained optimization
- Line search fixed point algorithms based on nonlinear conjugate gradient directions: application to constrained smooth convex optimization
- Accelerated adaptive Perry conjugate gradient algorithms based on the self-scaling memoryless BFGS update
- An improved Perry conjugate gradient method with adaptive parameter choice
- Preconditioned nonlinear conjugate gradient methods based on a modified secant equation
- A descent hybrid modification of the Polak-Ribière-Polyak conjugate gradient method
- Optimal scaling parameters for spectral conjugate gradient methods
- A stochastic subspace approach to gradient-free optimization in high dimensions
- Several accelerated subspace minimization conjugate gradient methods based on regularization model and convergence rate analysis for nonconvex problems
- W-methods in optimal control
- An extended delayed weighted gradient algorithm for solving strongly convex optimization problems
- Adaptive three-term PRP algorithms without gradient Lipschitz continuity condition for nonconvex functions
- Some sufficient descent conjugate gradient methods and their global convergence
- An improved Dai-Kou conjugate gradient algorithm for unconstrained optimization
- Computing equilibrium measures with power law kernels
- A conjugate gradient method with sufficient descent and global convergence for unconstrained nonlinear optimization
- A subspace minimization conjugate gradient method based on conic model for unconstrained optimization
- Two limited-memory optimization methods with minimum violation of the previous secant conditions
- Riemannian Multigrid Line Search for Low-Rank Problems
- LMBOPT: a limited memory method for bound-constrained optimization
- A class of descent four-term extension of the Dai-Liao conjugate gradient method based on the scaled memoryless BFGS update
- Scaled nonlinear conjugate gradient methods for nonlinear least squares problems
- Solving unconstrained optimization problems via hybrid CD-DY conjugate gradient methods with applications
- Optimal vaccination strategy for an SIRS model with imprecise parameters and Lévy noise
- A Barzilai-Borwein gradient projection method for sparse signal and blurred image restoration
- The global convergence of a new mixed conjugate gradient method for unconstrained optimization
- A modified PRP-type conjugate gradient projection algorithm for solving large-scale monotone nonlinear equations with convex constraint
- Convergence analysis of a nonmonotone projected gradient method for multiobjective optimization problems
- A new subspace minimization conjugate gradient method based on conic model for large-scale unconstrained optimization
- The projection technique for two open problems of unconstrained optimization problems
- A survey of gradient methods for solving nonlinear optimization
- An active set trust-region method for bound-constrained optimization
- Diagonally scaled memoryless quasi-Newton methods with application to compressed sensing
- Nonlinear Parameter Optimization Using R Tools
- A hybrid conjugate gradient based approach for solving unconstrained optimization and motion control problems
- A nonmonotone scaled Fletcher-Reeves conjugate gradient method with application in image reconstruction
- A modified Hager-Zhang conjugate gradient method with optimal choices for solving monotone nonlinear equations
- A modified PRP conjugate gradient algorithm with nonmonotone line search for nonsmooth convex optimization problems
- Reduced order optimal control of the convective FitzHugh-Nagumo equations
- Recent advances in bound constrained optimization
- Global optimization through a stochastic perturbation of the Polak-Ribière conjugate gradient method
- A modified three-term conjugate gradient method with sufficient descent property
- Spectral method and its application to the conjugate gradient method
- Structured minimal-memory inexact quasi-Newton method and secant preconditioners for augmented Lagrangian optimization
- The cyclic Barzilai–Borwein method for unconstrained optimization
- Numerical methods for large-scale nonlinear optimization
- On Hager and Zhang's conjugate gradient method with guaranteed descent
- The convergence of conjugate gradient method with nonmonotone line search
- Comparison of advanced large-scale minimization algorithms for the solution of inverse ill-posed problems
- Performance evaluation of descent CG methods for neural network training
- Two extensions of the Dai-Liao method with sufficient descent property based on a penalization scheme
- A new general form of conjugate gradient methods with guaranteed descent and strong global convergence properties
- Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization
- Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems
- Quasi-Newton acceleration for equality-constrained minimization
- Two adaptive Dai-Liao nonlinear conjugate gradient methods
- Some modified conjugate gradient methods for unconstrained optimization
- The global proof of the Polak-Ribière-Polyak algorithm under the YWL inexact line search technique
- Using approximate secant equations in limited memory methods for multilevel unconstrained optimization
- An optimal parameter for Dai-Liao family of conjugate gradient methods
- A modified Polak-Ribière-Polyak conjugate gradient algorithm for nonsmooth convex programs
- A modified nonmonotone trust region line search method
- Optimal control of convective FitzHugh-Nagumo equation
- An optimal parameter choice for the Dai-Liao family of conjugate gradient methods by avoiding a direction of the maximum magnification by the search direction matrix
- Two modified scaled nonlinear conjugate gradient methods
- A conjugate gradient method for unconstrained optimization problems
- The global convergence of a descent PRP conjugate gradient method
- Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search
- Second-order adjoints for solving PDE-constrained optimization problems
- A new family of conjugate gradient methods
- Improved Fletcher-Reeves and Dai-Yuan conjugate gradient methods with the strong Wolfe line search
- The Limited Memory Conjugate Gradient Method
- Two proposals for robust PCA using semidefinite programming
- An efficient multigrid strategy for large-scale molecular mechanics optimization
- A simple sufficient descent method for unconstrained optimization
- Two accelerated nonmonotone adaptive trust region line search methods
- A descent Dai-Liao conjugate gradient method for nonlinear equations
This page was built for software: CG_DESCENT