A globally and R-linearly convergent hybrid HS and PRP method and its inexact version with applications
DOI: 10.1007/s11253-015-1118-9 · zbMath: 1352.65160 · OpenAlex: W2189424086 · MaRDI QID: Q326018
Publication date: 12 October 2016
Published in: Ukrainian Mathematical Journal
Full work available at URL: https://doi.org/10.1007/s11253-015-1118-9
Keywords: algorithm; convergence; nonconvex optimization; computational efficiency; conjugate gradient methods; Moreau-Yosida regularization; large-scale unconstrained optimization; nonsmooth convex optimization; Hestenes-Stiefel method; Polak-Ribière-Polyak method
MSC classification: Numerical mathematical programming methods (65K05); Convex programming (90C25); Nonconvex programming, global optimization (90C26); Complexity and performance of numerical algorithms (65Y20)
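As background (not reproduced from the paper itself): the Hestenes-Stiefel (HS) and Polak-Ribière-Polyak (PRP) conjugate gradient parameters that the hybrid method combines are standardly defined, for gradient \(g_k = \nabla f(x_k)\) and search direction \(d_k\), by
\[
\beta_k^{\mathrm{HS}} = \frac{g_{k+1}^{\top} y_k}{d_k^{\top} y_k},
\qquad
\beta_k^{\mathrm{PRP}} = \frac{g_{k+1}^{\top} y_k}{\|g_k\|^{2}},
\qquad
y_k = g_{k+1} - g_k,
\]
with the next direction given by \(d_{k+1} = -g_{k+1} + \beta_k d_k\). The particular hybrid rule, its global and R-linear convergence analysis, and the inexact version are developed in the publication above.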
Cites Work
- Recent progress in unconstrained nonlinear optimization without derivatives
- Descent Property and Global Convergence of the Fletcher-Reeves Method with Inexact Line Search
- A Tool for the Analysis of Quasi-Newton Methods with Application to Unconstrained Minimization
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- A Globally and Superlinearly Convergent Algorithm for Nonsmooth Convex Minimization
- A Characterization of Superlinear Convergence and Its Application to Quasi-Newton Methods
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- An efficient hybrid conjugate gradient method for unconstrained optimization