A modified nonlinear conjugate gradient algorithm for large-scale nonsmooth convex optimization (Q1985287)

From MaRDI portal
Cited works

- Survey of Bundle Methods for Nonsmooth Optimization
- Atomic Decomposition by Basis Pursuit
- Comparison of formulations and solution methods for image restoration problems
- Q4004158
- Practical Aspects of the Moreau--Yosida Regularization: Theoretical Preliminaries
- A Globally and Superlinearly Convergent Algorithm for Nonsmooth Convex Minimization
- Monotone Operators and the Proximal Point Algorithm
- Convergence analysis of some methods for minimizing a nonsmooth convex function
- A family of variable metric proximal methods
- Convergence analysis of a proximal Newton method
- The Barzilai and Borwein gradient method with nonmonotone line search for nonsmooth convex optimization problems
- A trust region method for nonsmooth convex optimization
- Gradient trust region algorithm with limited memory BFGS update for nonsmooth convex minimization
- Proximity control in bundle methods for convex nondifferentiable minimization
- A bundle-Newton method for nonsmooth unconstrained minimization
- Methods of descent for nondifferentiable optimization
- Proximal level bundle methods for convex nondifferentiable optimization, saddle-point problems and variational inequalities
- New limited memory bundle method for large-scale nonsmooth optimization
- Two modified hybrid conjugate gradient methods based on a hybrid secant equation
- Methods of conjugate gradients for solving linear systems
- Q5563083
- Efficient generalized conjugate gradient algorithms. I: Theory
- Function minimization by conjugate gradients
- Q3882253
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- Q5479892
- A modified Polak-Ribière-Polyak conjugate gradient algorithm for nonsmooth convex programs
- A modified Hestenes and Stiefel conjugate gradient algorithm for large-scale nonsmooth minimizations and nonlinear equations
- A modified PRP conjugate gradient algorithm with nonmonotone line search for nonsmooth convex optimization problems
- A new smoothing modified three-term conjugate gradient method for \(l_1\)-norm minimization problem
- The smoothing FR conjugate gradient method for solving a kind of nonsmooth optimization problem with \(l_1\)-norm
- Convergence of some algorithms for convex minimization
- Q3141900
- Projected gradient methods for linearly constrained problems
- Convergence Analysis of Some Algorithms for Solving Nonsmooth Equations
- Optimization and nonsmooth analysis
- The conjugate gradient method in extremal problems
- A descent algorithm for nonsmooth convex optimization
- Benchmarking optimization software with performance profiles


scientific article

Language: English
Label: A modified nonlinear conjugate gradient algorithm for large-scale nonsmooth convex optimization
Description: scientific article

    Statements

    A modified nonlinear conjugate gradient algorithm for large-scale nonsmooth convex optimization (English)
    7 April 2020
    Nonlinear conjugate gradient (CG) methods are among the most widely used methods for solving large-scale smooth problems because of their simplicity and low memory requirements. This paper presents a modified CG method that carries these advantages over to large-scale nonsmooth convex optimization problems. The method combines the strong global convergence property of the Dai-Yuan (DY) method with the numerical efficiency of the Hestenes-Stiefel (HS) method. Its search direction satisfies the sufficient descent property and lies in a trust region. Numerical results indicate that the method performs very well.
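    The DY/HS combination described above can be illustrated, on a smooth test function, by a generic nonlinear CG iteration whose parameter is a standard hybrid of the Hestenes-Stiefel and Dai-Yuan formulas. The sketch below is an assumption-laden illustration (Armijo backtracking, beta = max(0, min(beta_HS, beta_DY)), a steepest-descent safeguard), not the authors' exact modified update, which is built on the Moreau-Yosida regularization of the nonsmooth objective.

```python
import numpy as np

def hybrid_cg(f, grad, x0, max_iter=200, tol=1e-8):
    """Generic nonlinear CG with a hybrid HS/DY parameter.

    Illustrative sketch only; the paper's modified direction
    (with sufficient descent and a trust-region bound) differs.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search (an assumed, simple choice).
        alpha, fx, gd = 1.0, f(x), g @ d
        while f(x + alpha * d) > fx + 1e-4 * alpha * gd and alpha > 1e-12:
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        denom = d @ y
        if abs(denom) < 1e-16:
            d = -g_new  # degenerate denominator: restart with steepest descent
        else:
            beta_hs = (g_new @ y) / denom         # Hestenes-Stiefel
            beta_dy = (g_new @ g_new) / denom     # Dai-Yuan
            beta = max(0.0, min(beta_hs, beta_dy))
            d = -g_new + beta * d
            if g_new @ d >= 0:  # safeguard: keep a descent direction
                d = -g_new
        x, g = x_new, g_new
    return x

# Smooth convex test problem: f(x) = 0.5 x^T A x - b^T x, minimizer solves Ax = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = hybrid_cg(lambda x: 0.5 * x @ A @ x - b @ x,
                   lambda x: A @ x - b,
                   np.zeros(2))
```

    On a nonsmooth convex objective one would first form a smooth surrogate (e.g., the Moreau-Yosida regularization mentioned in the keywords) and apply the CG iteration to that surrogate.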
    conjugate gradient method
    Moreau-Yosida regularization
    nonsmooth large-scale problems
    global convergence

    Identifiers
