Exploiting damped techniques for nonlinear conjugate gradient methods
Publication: 684134
DOI: 10.1007/S00186-017-0593-1
zbMATH Open: 1390.90388
OpenAlex: W2617669545
MaRDI QID: Q684134
FDO: Q684134
Authors: Mehiddin Al-Baali, Andrea Caliciotti, Giovanni Fasano, Massimo Roma
Publication date: 9 February 2018
Published in: Mathematical Methods of Operations Research
Full work available at URL: http://hdl.handle.net/10278/3694943
Recommendations
- Quasi-Newton based preconditioning and damped quasi-Newton schemes for nonlinear conjugate gradient methods
- Preconditioned nonlinear conjugate gradient methods based on a modified secant equation
- Novel preconditioners based on quasi-Newton updates for nonlinear conjugate gradient methods
- Damped techniques for the limited memory BFGS method for large-scale optimization
- Improved damped quasi-Newton methods for unconstrained optimization
Keywords: nonlinear conjugate gradient methods; large-scale unconstrained optimization; quasi-Newton updates; damped techniques
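The record itself does not describe the algorithm. As a rough illustration of the "damped techniques" and "quasi-Newton updates" keywords, the sketch below shows Powell's classical damping of the BFGS secant pair; this is a standard construction from the damped quasi-Newton literature cited above, not necessarily the exact scheme used in this paper, and the function name and the default parameter phi=0.2 are illustrative.

```python
import numpy as np

def damped_bfgs_update(B, s, y, phi=0.2):
    """One damped BFGS update of the Hessian approximation B.

    Powell's damping replaces the gradient difference y by the convex
    combination y_hat = theta*y + (1 - theta)*B@s, with theta chosen so
    that s @ y_hat >= phi * (s @ B @ s).  This keeps the updated matrix
    positive definite even when the curvature condition s @ y > 0 fails.
    """
    Bs = B @ s
    sBs = s @ Bs
    sy = s @ y

    # Damping parameter theta (Powell, 1978).
    if sy >= phi * sBs:
        theta = 1.0
    else:
        theta = (1.0 - phi) * sBs / (sBs - sy)

    y_hat = theta * y + (1.0 - theta) * Bs

    # Standard BFGS formula applied with the damped pair (s, y_hat).
    return (B
            - np.outer(Bs, Bs) / sBs
            + np.outer(y_hat, y_hat) / (s @ y_hat))
```

In the works listed under Recommendations and Cites Work, updates of this kind are used, for example, to build preconditioners or search directions within nonlinear conjugate gradient iterations.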
Cites Work
- CUTEst: a constrained and unconstrained testing environment with safe threads for mathematical optimization
- Title not available
- Title not available
- Benchmarking optimization software with performance profiles
- Title not available
- Line search algorithms with guaranteed sufficient decrease
- Descent Property and Global Convergence of the Fletcher-Reeves Method with Inexact Line Search
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Title not available
- A novel class of approximate inverse preconditioners for large positive definite linear systems in optimization
- On A Class of Limited Memory Preconditioners For Large Scale Linear Systems With Multiple Right-Hand Sides
- Preconditioning Newton-Krylov methods in nonconvex large scale optimization
- Automatic Preconditioning by Limited Memory Quasi-Newton Updating
- A globally convergent version of the Polak-Ribière conjugate gradient method
- Damped techniques for the limited memory BFGS method for large-scale optimization
- On practical modifications of the quasi-Newton BFGS method
- Algorithms for nonlinear constraints that use Lagrangian functions
- Methods of unconstrained optimization
- The Limited Memory Conjugate Gradient Method
- Conjugate gradient algorithms in nonconvex optimization
- How bad are the BFGS and DFP methods when the objective function is quadratic?
- Novel preconditioners based on quasi-Newton updates for nonlinear conjugate gradient methods
- Convergence conditions, line search algorithms and trust region implementations for the Polak–Ribière conjugate gradient method
- Damped techniques for enforcing convergence of quasi-Newton methods
- Broyden's quasi-Newton methods for a nonlinear system of equations and unconstrained optimization: a review and open problems
- On the Order of Convergence of Preconditioned Nonlinear Conjugate Gradient Methods
- Improved damped quasi-Newton methods for unconstrained optimization
Cited In (13)
- Title not available
- Improved damped quasi-Newton methods for unconstrained optimization
- Preconditioned nonlinear conjugate gradient methods based on a modified secant equation
- An accelerated three-term conjugate gradient method with sufficient descent condition and conjugacy condition
- On the inclusion of damping terms in the hyperbolic MBO algorithm
- A new conjugate gradient method based on quasi-Newton equation for unconstrained optimization
- Damped techniques for enforcing convergence of quasi-Newton methods
- A class of approximate inverse preconditioners based on Krylov-subspace methods for large-scale nonconvex optimization
- Gradient method with multiple damping for large-scale unconstrained optimization
- A sufficient descent nonlinear conjugate gradient method for solving \(\mathcal{M}\)-tensor equations
- Real-time pricing method for smart grid based on social welfare maximization model
- Damped techniques for the limited memory BFGS method for large-scale optimization
- Novel preconditioners based on quasi-Newton updates for nonlinear conjugate gradient methods
Uses Software
- CUTEst
This page was built for publication: Exploiting damped techniques for nonlinear conjugate gradient methods