Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization (Q442712)

From MaRDI portal
Property / review text
 
Property / review text: Conjugate gradient methods are very efficient for solving large-scale unconstrained optimization problems. Their main advantage is that no matrices are stored, which greatly reduces memory requirements. When computing a search direction, many choices for the parameter \(\beta\) (which characterizes the conjugate gradient method), and modifications thereof, exist. To incorporate second-order information about the objective function into conjugate gradient methods, many variants based on secant conditions have been proposed. Although such methods have the global convergence property, they do not necessarily satisfy the (sufficient) descent condition. In this paper, the authors propose new efficient conjugate gradient methods for solving large-scale unconstrained optimization problems that generate descent search directions and are globally convergent for general objective functions. The methods combine ideas based on the secant condition proposed by \textit{Y.-H. Dai} and \textit{L.-Z. Liao} [Appl. Math. Optim. 43, No. 1, 87--101 (2001; Zbl 0973.65050)] and the formula for \(\beta\) proposed by \textit{W. W. Hager} and \textit{H. Zhang} [SIAM J. Optim. 16, No. 1, 170--192 (2005; Zbl 1093.90085)]. / rank
 
Normal rank
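To make the general scheme in the review concrete, the following is a minimal sketch of a nonlinear conjugate gradient iteration using the Hager-Zhang choice of \(\beta\) [SIAM J. Optim. 16, No. 1, 170--192 (2005; Zbl 1093.90085)]. This is not the authors' proposed method (which additionally builds on the Dai-Liao secant condition); the backtracking Armijo line search and the restart threshold are illustrative assumptions, since the convergence theory relies on Wolfe conditions.

```python
import numpy as np

def cg_hager_zhang(f, grad, x0, max_iter=200, tol=1e-8):
    """Nonlinear CG sketch with the Hager-Zhang beta.

    Only gradients are used; no matrices are stored, which is
    the storage advantage the review mentions. The Armijo
    backtracking line search is a simple stand-in."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking Armijo line search (illustrative choice)
        alpha, c = 1.0, 1e-4
        while f(x + alpha * d) > f(x) + c * alpha * (g @ d):
            alpha *= 0.5
            if alpha < 1e-12:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g           # gradient difference
        dy = d @ y
        if abs(dy) < 1e-16:
            beta = 0.0          # restart with steepest descent
        else:
            # Hager-Zhang formula for beta
            beta = ((y - 2.0 * d * (y @ y) / dy) @ g_new) / dy
        d = -g_new + beta * d   # new search direction
        x, g = x_new, g_new
    return x
```

With this \(\beta\), the direction satisfies a sufficient descent condition independently of the line search, which is the property the reviewed paper seeks to retain while also incorporating secant-condition information.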
Property / reviewed by
 
Property / reviewed by: Ctirad Matonoha / rank
 
Normal rank
Property / Mathematics Subject Classification ID
 
Property / Mathematics Subject Classification ID: 65K05 / rank
 
Normal rank
Property / Mathematics Subject Classification ID
 
Property / Mathematics Subject Classification ID: 90C30 / rank
 
Normal rank
Property / Mathematics Subject Classification ID
 
Property / Mathematics Subject Classification ID: 90C06 / rank
 
Normal rank
Property / zbMATH DE Number
 
Property / zbMATH DE Number: 6063164 / rank
 
Normal rank
Property / zbMATH Keywords
 
Property / zbMATH Keywords: unconstrained optimization / rank
 
Normal rank
Property / zbMATH Keywords
 
Property / zbMATH Keywords: conjugate gradient method / rank
 
Normal rank
Property / zbMATH Keywords
 
Property / zbMATH Keywords: descent search direction / rank
 
Normal rank
Property / zbMATH Keywords
 
Property / zbMATH Keywords: secant condition / rank
 
Normal rank
Property / zbMATH Keywords
 
Property / zbMATH Keywords: global convergence / rank
 
Normal rank
Property / zbMATH Keywords
 
Property / zbMATH Keywords: large-scale / rank
 
Normal rank
Property / describes a project that uses
 
Property / describes a project that uses: CUTE / rank
 
Normal rank
Property / describes a project that uses
 
Property / describes a project that uses: CUTEr / rank
 
Normal rank
Property / describes a project that uses
 
Property / describes a project that uses: CG_DESCENT / rank
 
Normal rank
Property / MaRDI profile type
 
Property / MaRDI profile type: MaRDI publication profile / rank
 
Normal rank
Property / full work available at URL
 
Property / full work available at URL: https://doi.org/10.1016/j.cam.2012.01.036 / rank
 
Normal rank
Property / OpenAlex ID
 
Property / OpenAlex ID: W2170270075 / rank
 
Normal rank
Property / cites work
 
Property / cites work: Methods of conjugate gradients for solving linear systems / rank
 
Normal rank
Property / cites work
 
Property / cites work: Function minimization by conjugate gradients / rank
 
Normal rank
Property / cites work
 
Property / cites work: Q5491447 / rank
 
Normal rank
Property / cites work
 
Property / cites work: Global Convergence Properties of Conjugate Gradient Methods for Optimization / rank
 
Normal rank
Property / cites work
 
Property / cites work: A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property / rank
 
Normal rank
Property / cites work
 
Property / cites work: Q5479892 / rank
 
Normal rank
Property / cites work
 
Property / cites work: New conjugacy conditions and related nonlinear conjugate gradient methods / rank
 
Normal rank
Property / cites work
 
Property / cites work: Global convergence properties of nonlinear conjugate gradient methods with modified secant condition / rank
 
Normal rank
Property / cites work
 
Property / cites work: Multi-step nonlinear conjugate gradient methods for unconstrained minimization / rank
 
Normal rank
Property / cites work
 
Property / cites work: A nonlinear conjugate gradient method based on the MBFGS secant condition / rank
 
Normal rank
Property / cites work
 
Property / cites work: Nonlinear conjugate gradient methods with structured secant condition for nonlinear least squares problems / rank
 
Normal rank
Property / cites work
 
Property / cites work: Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization / rank
 
Normal rank
Property / cites work
 
Property / cites work: A Three-Term Conjugate Gradient Method with Sufficient Descent Property for Unconstrained Optimization / rank
 
Normal rank
Property / cites work
 
Property / cites work: A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search / rank
 
Normal rank
Property / cites work
 
Property / cites work: Global convergence of modified Polak-Ribière-Polyak conjugate gradient methods with sufficient descent property / rank
 
Normal rank
Property / cites work
 
Property / cites work: Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems / rank
 
Normal rank
Property / cites work
 
Property / cites work: Technical Note—A Modified Conjugate Gradient Algorithm / rank
 
Normal rank
Property / cites work
 
Property / cites work: New quasi-Newton equation and related methods for unconstrained optimization / rank
 
Normal rank
Property / cites work
 
Property / cites work: Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations / rank
 
Normal rank
Property / cites work
 
Property / cites work: A modified BFGS method and its global convergence in nonconvex minimization / rank
 
Normal rank
Property / cites work
 
Property / cites work: Multi-step quasi-Newton methods for optimization / rank
 
Normal rank
Property / cites work
 
Property / cites work: New versions of the Hestenes-Stiefel nonlinear conjugate gradient method based on the secant condition for optimization / rank
 
Normal rank
Property / cites work
 
Property / cites work: Q4103338 / rank
 
Normal rank
Property / cites work
 
Property / cites work: CUTE / rank
 
Normal rank
Property / cites work
 
Property / cites work: Algorithm 851 / rank
 
Normal rank
Property / cites work
 
Property / cites work: Benchmarking optimization software with performance profiles. / rank
 
Normal rank
 

Latest revision as of 13:09, 5 July 2024

scientific article
English
Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization
3 August 2012