A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
From MaRDI portal
Publication:5317554
Recommendations
- A new type of descent conjugate gradient method with exact line search (scientific article; zbMATH DE number 1092181)
- A new general form of conjugate gradient methods with guaranteed descent and strong global convergence properties (scientific article; zbMATH DE number 1022806)
- A new nonlinear conjugate gradient method with guaranteed global convergence (scientific article; zbMATH DE number 179266)
- A new conjugate gradient method with the Wolfe line search
- A new conjugate gradient method for unconstrained optimization with sufficient descent
- An efficient conjugate gradient method with sufficient descent property
- A new conjugate gradient method with strongly global convergence and sufficient descent condition
Cited in (only showing first 100 items)
- New nonlinear conjugate gradient methods based on optimal Dai-Liao parameters
- An Uncertainty-Weighted Asynchronous ADMM Method for Parallel PDE Parameter Estimation
- \(n\)-step quadratic convergence of the MPRP method with a restart strategy
- Another three-term conjugate gradient method close to the memoryless BFGS for large-scale unconstrained optimization problems
- A modified spectral conjugate gradient method with global convergence
- A robust extremum seeking scheme for dynamic systems with uncertainties and disturbances
- A modified Hestenes and Stiefel conjugate gradient algorithm for large-scale nonsmooth minimizations and nonlinear equations
- A derivative-free conjugate gradient method and its global convergence for solving symmetric nonlinear equations
- A modified conjugate gradient method for general convex functions
- A modified Hager-Zhang conjugate gradient method with optimal choices for solving monotone nonlinear equations
- Two optimal Hager-Zhang conjugate gradient methods for solving monotone nonlinear equations
- A new conjugate gradient method with an efficient memory structure
- A Modified PRP Conjugate Gradient Algorithm with Trust Region for Optimization Problems
- A modified Perry's conjugate gradient method-based derivative-free method for solving large-scale nonlinear monotone equations
- A Fokker–Planck Feedback Control-Constrained Approach for Modelling Crowd Motion
- Proximal methods for nonlinear programming: Double regularization and inexact subproblems
- A modified PRP conjugate gradient algorithm with nonmonotone line search for nonsmooth convex optimization problems
- Quantum optimal control using the adjoint method
- A scaled nonlinear conjugate gradient algorithm for unconstrained optimization
- A modified conjugate gradient method based on the self-scaling memoryless BFGS update
- MISTER-T: an open-source software package for quantum optimal control of multi-electron systems on arbitrary geometries
- A hybrid conjugate gradient method with descent property for unconstrained optimization
- A Regularization Approach for an Inverse Source Problem in Elliptic Systems from Single Cauchy Data
- Application of scaled nonlinear conjugate-gradient algorithms to the inverse natural convection problem
- Globally convergent Polak-Ribière-Polyak conjugate gradient methods under a modified Wolfe line search
- A method for solving exact-controllability problems governed by closed quantum spin systems
- A nonmonotone approximate sequence algorithm for unconstrained nonlinear optimization
- A projection-based derivative free DFP approach for solving system of nonlinear convex constrained monotone equations with image restoration applications
- On a conjugate directions method for solving strictly convex QP problem
- A note on robust descent in differentiable optimization
- Solving optimal control problem of monodomain model using hybrid conjugate gradient methods
- Some global convergence properties of the Wei-Yao-Liu conjugate gradient method with inexact line search
- Modification of nonlinear conjugate gradient method with weak Wolfe-Powell line search
- Spectral method and its application to the conjugate gradient method
- A modified HZ conjugate gradient algorithm without gradient Lipschitz continuous condition for nonconvex functions
- Two classes of spectral conjugate gradient methods for unconstrained optimizations
- An efficient conjugate gradient-based algorithm for unconstrained optimization and its projection extension to large-scale constrained nonlinear equations with applications in signal recovery and image denoising problems
- A derivative-free Liu-Storey method for solving large-scale nonlinear systems of equations
- A class of accelerated subspace minimization conjugate gradient methods
- Sufficient descent Riemannian conjugate gradient methods
- A modified Dai-Liao conjugate gradient method for solving unconstrained optimization and image restoration problems
- Two sufficient descent three-term conjugate gradient methods for unconstrained optimization problems with applications in compressive sensing
- Modeling and control through leadership of a refined flocking system
- A new family of hybrid three-term conjugate gradient methods with applications in image restoration
- Structured minimal-memory inexact quasi-Newton method and secant preconditioners for augmented Lagrangian optimization
- A three-term conjugate gradient method with a random parameter for large-scale unconstrained optimization and its application in regression model
- A hybrid BB-type method for solving large scale unconstrained optimization
- A derivative‐free projection method for nonlinear equations with non‐Lipschitz operator: Application to LASSO problem
- Conditional gradient method for vector optimization
- An improved Polak-Ribière-Polyak conjugate gradient method with an efficient restart direction
- A descent Dai-Liao conjugate gradient method based on a modified secant equation and its global convergence
- New hybrid conjugate gradient and Broyden-Fletcher-Goldfarb-Shanno conjugate gradient methods
- Global convergence of a nonlinear conjugate gradient method
- The Variational Gaussian Approximation Revisited
- New hybrid conjugate gradient method as a convex combination of HZ and CD methods
- An efficient gradient method with approximate optimal stepsize for large-scale unconstrained optimization
- On the parameter estimation of Box-Cox transformation cure model
- A new subspace minimization conjugate gradient method based on tensor model for unconstrained optimization
- Adaptive three-term family of conjugate residual methods for system of monotone nonlinear equations
- A Fokker-Planck feedback control framework for optimal personalized therapies in colon cancer-induced angiogenesis
- An accelerated relaxed-inertial strategy based CGP algorithm with restart technique for constrained nonlinear pseudo-monotone equations to image de-blurring problems
- Diagonal quasi-Newton methods via least change updating principle with weighted Frobenius norm
- On Hager and Zhang's conjugate gradient method with guaranteed descent
- A conjugate gradient method with descent direction for unconstrained optimization
- Combining and scaling descent and negative curvature directions
- A family of gradient methods using Householder transformation with application to hypergraph partitioning
- The convergence of conjugate gradient method with nonmonotone line search
- A family of quasi-Newton methods for unconstrained optimization problems
- A three term Polak-Ribière-Polyak conjugate gradient method close to the memoryless BFGS quasi-Newton method
- A trust region algorithm with conjugate gradient technique for optimization problems
- A sufficient descent Liu–Storey conjugate gradient method and its global convergence
- A sufficient descent Dai-Yuan type nonlinear conjugate gradient method for unconstrained optimization problems
- A conjugate gradient method based on a modified secant relation for unconstrained optimization
- A modified nonlinear conjugate gradient method with the Armijo line search and its application
- Two modified DY conjugate gradient methods for unconstrained optimization problems
- Two extensions of the Dai-Liao method with sufficient descent property based on a penalization scheme
- A new general form of conjugate gradient methods with guaranteed descent and strong global convergence properties
- Another improved Wei-Yao-Liu nonlinear conjugate gradient method with sufficient descent property
- A self-adaptive projection method for nonlinear monotone equations with convex constraints
- A mini-batch stochastic conjugate gradient algorithm with variance reduction
- Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization
- An efficient conjugate gradient trust-region approach for systems of nonlinear equation
- A modified Dai-Liao conjugate gradient method with a new parameter for solving image restoration problems
- A descent conjugate gradient algorithm for optimization problems and its applications in image restoration and compression sensing
- Two modified nonlinear conjugate gradient methods with disturbance factors for unconstrained optimization
- A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search
- Stochastic three-term conjugate gradient method with variance technique for non-convex learning
- An Adjoint Method for High-Resolution EPMA Based on the Spherical Harmonics (PN) Model of Electron Transport
- Another modified version of RMIL conjugate gradient method
- A novel value for the parameter in the Dai-Liao-type conjugate gradient method
- Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems
- An Liu-Storey-type method for solving large-scale nonlinear monotone equations
- A one-parameter class of three-term conjugate gradient methods with an adaptive parameter choice
- A new subspace minimization conjugate gradient method for unconstrained minimization
- A robust BFGS algorithm for unconstrained nonlinear optimization problems
- Some modified conjugate gradient methods for unconstrained optimization
- New hybrid conjugate gradient method for unconstrained optimization
- A three-term conjugate gradient algorithm with quadratic convergence for unconstrained optimization problems
- A new modified three-term Hestenes-Stiefel conjugate gradient method with sufficient descent property and its global convergence
- Two adaptive Dai-Liao nonlinear conjugate gradient methods
This page was built for publication: A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search