Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization (Q442712): Difference between revisions

From MaRDI portal
Importer (talk | contribs)
Created a new Item
 
Importer (talk | contribs)
Changed an Item
Property / review text
 
Conjugate gradient methods are very efficient for solving large-scale unconstrained optimization problems. Their main advantage is that no matrices are stored, which greatly reduces memory requirements. When computing a search direction, many choices for the parameter \(\beta\) (which characterizes the conjugate gradient method), as well as modifications of them, are available. To incorporate second-order information about the objective function into conjugate gradient methods, many variants based on secant conditions have been proposed. Although such methods possess the global convergence property, they do not necessarily satisfy the (sufficient) descent condition. In this paper, the authors propose new efficient conjugate gradient methods for solving large-scale unconstrained optimization problems that generate descent search directions and are globally convergent for general objective functions. The methods combine the secant condition proposed by \textit{Y.-H. Dai} and \textit{L.-Z. Liao} [Appl. Math. Optim. 43, No. 1, 87--101 (2001; Zbl 0973.65050)] with the formula for \(\beta\) proposed by \textit{W. W. Hager} and \textit{H. Zhang} [SIAM J. Optim. 16, No. 1, 170--192 (2005; Zbl 1093.90085)].
Property / review text: Conjugate gradient methods are very efficient for solving large-scale unconstrained optimization problems. Their main advantage is that no matrices are stored, which greatly reduces memory requirements. When computing a search direction, many choices for the parameter \(\beta\) (which characterizes the conjugate gradient method), as well as modifications of them, are available. To incorporate second-order information about the objective function into conjugate gradient methods, many variants based on secant conditions have been proposed. Although such methods possess the global convergence property, they do not necessarily satisfy the (sufficient) descent condition. In this paper, the authors propose new efficient conjugate gradient methods for solving large-scale unconstrained optimization problems that generate descent search directions and are globally convergent for general objective functions. The methods combine the secant condition proposed by \textit{Y.-H. Dai} and \textit{L.-Z. Liao} [Appl. Math. Optim. 43, No. 1, 87--101 (2001; Zbl 0973.65050)] with the formula for \(\beta\) proposed by \textit{W. W. Hager} and \textit{H. Zhang} [SIAM J. Optim. 16, No. 1, 170--192 (2005; Zbl 1093.90085)]. / rank
 
Normal rank
Property / reviewed by
 
Property / reviewed by: Ctirad Matonoha / rank
 
Normal rank
Property / Mathematics Subject Classification ID
 
Property / Mathematics Subject Classification ID: 65K05 / rank
 
Normal rank
Property / Mathematics Subject Classification ID
 
Property / Mathematics Subject Classification ID: 90C30 / rank
 
Normal rank
Property / Mathematics Subject Classification ID
 
Property / Mathematics Subject Classification ID: 90C06 / rank
 
Normal rank
Property / zbMATH DE Number
 
Property / zbMATH DE Number: 6063164 / rank
 
Normal rank
Property / zbMATH Keywords
 
unconstrained optimization
Property / zbMATH Keywords: unconstrained optimization / rank
 
Normal rank
Property / zbMATH Keywords
 
conjugate gradient method
Property / zbMATH Keywords: conjugate gradient method / rank
 
Normal rank
Property / zbMATH Keywords
 
descent search direction
Property / zbMATH Keywords: descent search direction / rank
 
Normal rank
Property / zbMATH Keywords
 
secant condition
Property / zbMATH Keywords: secant condition / rank
 
Normal rank
Property / zbMATH Keywords
 
global convergence
Property / zbMATH Keywords: global convergence / rank
 
Normal rank
Property / zbMATH Keywords
 
large-scale
Property / zbMATH Keywords: large-scale / rank
 
Normal rank

Revision as of 02:26, 30 June 2023

scientific article
Language: English
Label: Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization
Description: scientific article

    Statements

    Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization (English)
    3 August 2012
    Conjugate gradient methods are very efficient for solving large-scale unconstrained optimization problems. Their main advantage is that no matrices are stored, which greatly reduces memory requirements. When computing a search direction, many choices for the parameter \(\beta\) (which characterizes the conjugate gradient method), as well as modifications of them, are available. To incorporate second-order information about the objective function into conjugate gradient methods, many variants based on secant conditions have been proposed. Although such methods possess the global convergence property, they do not necessarily satisfy the (sufficient) descent condition. In this paper, the authors propose new efficient conjugate gradient methods for solving large-scale unconstrained optimization problems that generate descent search directions and are globally convergent for general objective functions. The methods combine the secant condition proposed by \textit{Y.-H. Dai} and \textit{L.-Z. Liao} [Appl. Math. Optim. 43, No. 1, 87--101 (2001; Zbl 0973.65050)] with the formula for \(\beta\) proposed by \textit{W. W. Hager} and \textit{H. Zhang} [SIAM J. Optim. 16, No. 1, 170--192 (2005; Zbl 1093.90085)].
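    As an illustration of the general scheme the review describes (not of the authors' specific methods), the following is a minimal sketch of a nonlinear conjugate gradient iteration using the Hager-Zhang formula for \(\beta\). All function names are hypothetical, and the Armijo backtracking line search is a simplification: the convergence theory assumes Wolfe-type conditions.

```python
import numpy as np

def hz_beta(g_new, d, y):
    # Hager-Zhang beta: (y - 2 d ||y||^2 / (d^T y))^T g_new / (d^T y),
    # where y = g_new - g and d is the previous search direction.
    dy = d @ y
    return (y - 2.0 * d * (y @ y) / dy) @ g_new / dy

def cg_hz(f, grad, x0, tol=1e-8, max_iter=500):
    # Nonlinear CG with Hager-Zhang beta; matrix-free, only vectors stored.
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking (simplified stand-in for a Wolfe line search)
        t = 1.0
        while f(x + t * d) > f(x) + 1e-4 * t * (g @ d):
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        y = g_new - g
        d = -g_new + hz_beta(g_new, d, y) * d
        x, g = x_new, g_new
    return x

# Small convex quadratic test problem: f(x) = 0.5 x^T A x - b^T x,
# whose minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = cg_hz(f, grad, np.zeros(2))
```

    A notable property of the Hager-Zhang choice is that the generated directions satisfy a sufficient descent condition independently of the line search, which is the kind of guarantee the reviewed paper seeks for secant-condition-based variants.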
    unconstrained optimization
    conjugate gradient method
    descent search direction
    secant condition
    global convergence
    large-scale