An efficient gradient method using the Yuan steplength
From MaRDI portal
Publication: 480934
DOI: 10.1007/s10589-014-9669-5
zbMath: 1310.90082
OpenAlex: W2109062572
Wikidata: Q58832753 (Scholia: Q58832753)
MaRDI QID: Q480934
Hongchao Zhang, Roberta De Asmundis, William W. Hager, Gerardo Toraldo, Daniela di Serafino
Publication date: 12 December 2014
Published in: Computational Optimization and Applications
Full work available at URL: https://doi.org/10.1007/s10589-014-9669-5
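Judging from the title and the cited works, the paper concerns gradient methods for strictly convex quadratic minimization that combine the classical Cauchy (exact line-search) steplength with the steplength proposed by Yuan. The snippet below is a minimal illustrative sketch of that idea, not the authors' algorithm: it assumes the objective f(x) = ½ xᵀAx − bᵀx with A symmetric positive definite, uses the Yuan steplength formula as recalled from Yuan's stepsize rule, and simply alternates Cauchy and Yuan steps; the function name, the alternation pattern, and the test problem are hypothetical choices made for the example.

```python
import numpy as np

def yuan_gradient_sketch(A, b, x0, tol=1e-8, max_iter=5000):
    """Gradient method for f(x) = 0.5*x'Ax - b'x with A symmetric positive definite.
    Alternates the Cauchy (exact line-search) steplength with a Yuan-type steplength
    built from two consecutive Cauchy steplengths. Illustrative sketch only."""
    x = np.asarray(x0, dtype=float).copy()
    g = A @ x - b                        # gradient of the quadratic
    alpha_sd_prev = None                 # Cauchy steplength at the previous iteration
    gnorm_prev = None                    # gradient norm at the previous iteration
    for k in range(max_iter):
        gnorm = np.linalg.norm(g)
        if gnorm <= tol:
            break
        alpha_sd = (g @ g) / (g @ (A @ g))   # Cauchy steplength ||g||^2 / (g'Ag)
        if alpha_sd_prev is None or k % 2 == 0:
            alpha = alpha_sd                 # plain steepest-descent step
        else:
            # Yuan-type steplength (as recalled from Yuan's stepsize rule;
            # the paper's exact variant may differ)
            d = 1.0 / alpha_sd_prev - 1.0 / alpha_sd
            s = 1.0 / alpha_sd_prev + 1.0 / alpha_sd
            alpha = 2.0 / (s + np.sqrt(d * d + 4.0 * gnorm**2
                                       / (alpha_sd_prev * gnorm_prev)**2))
        x = x - alpha * g
        g = A @ x - b
        alpha_sd_prev, gnorm_prev = alpha_sd, gnorm
    return x

# Hypothetical usage on a small ill-conditioned quadratic
if __name__ == "__main__":
    A = np.diag([1.0, 10.0, 1000.0])
    b = np.ones(3)
    x = yuan_gradient_sketch(A, b, np.zeros(3))
    print(x, np.linalg.norm(A @ x - b))
```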
Related Items
- On the regularizing behavior of the SDA and SDC gradient methods in the solution of linear ill-posed problems
- On the application of the spectral projected gradient method in image segmentation
- An accelerated minimal gradient method with momentum for strictly convex quadratic optimization
- On initial point selection of the steepest descent algorithm for general quadratic functions
- On the global convergence rate of the gradient descent method for functions with Hölder continuous gradients
- On the worst case performance of the steepest descent algorithm for quadratic functions
- Split Bregman iteration for multi-period mean variance portfolio optimization
- Regularized quadratic penalty methods for shape from shading
- Delayed Gradient Methods for Symmetric and Positive Definite Linear Systems
- A family of optimal weighted conjugate-gradient-type methods for strictly convex quadratic minimization
- Adaptive \(l_1\)-regularization for short-selling control in portfolio selection
- Analysis of the Barzilai-Borwein step-sizes for problems in Hilbert spaces
- Gradient method with multiple damping for large-scale unconstrained optimization
- On the steplength selection in gradient methods for unconstrained optimization
- A gradient method exploiting the two dimensional quadratic termination property
- A new steplength selection for scaled gradient methods with application to image deblurring
- Spectral Properties of Barzilai-Borwein Rules in Solving Singly Linearly Constrained Optimization Problems Subject to Lower and Upper Bounds
- Fast gradient methods with alignment for symmetric linear systems without using Cauchy step
- On the Preconditioned Delayed Weighted Gradient Method
- A family of modified spectral projection methods for nonlinear monotone equations with convex constraint
- Gradient methods exploiting spectral properties
- Variable metric techniques for forward-backward methods in imaging
- New stepsizes for the gradient method
- A cyclic block coordinate descent method with generalized gradient projections
- A second-order gradient method for convex minimization
- Fused Lasso approach in portfolio selection
- A coordinate descent method for total variation minimization
- A generalized eigenvalues classifier with embedded feature selection
- Steplength selection in gradient projection methods for box-constrained quadratic programs
- A new nonmonotone trust region Barzilai-Borwein method for unconstrained optimization problems
- Properties of the delayed weighted gradient method
- A Two-Phase Gradient Method for Quadratic Programming Problems with a Single Linear Constraint and Bounds on the Variables
- A delayed weighted gradient method for strictly convex quadratic minimization
- Reconstruction of 3D X-ray CT images from reduced sampling by a scaled gradient projection algorithm
- ACQUIRE: an inexact iteratively reweighted norm approach for TV-based Poisson image restoration
- On the steepest descent algorithm for quadratic functions
- On the asymptotic convergence and acceleration of gradient methods
- On \(R\)-linear convergence analysis for a class of gradient methods
- A family of spectral gradient methods for optimization
- Semi-supervised generalized eigenvalues classification
- Gravity-magnetic cross-gradient joint inversion by the cyclic gradient method
- Equipping the Barzilai-Borwein Method with the Two Dimensional Quadratic Termination Property
- Hybrid limited memory gradient projection methods for box-constrained optimization problems
- On the acceleration of the Barzilai-Borwein method
Cites Work
- On the regularizing behavior of the SDA and SDC gradient methods in the solution of linear ill-posed problems
- The chaotic nature of faster gradient descent methods
- A limited memory steepest descent method
- On a successive transformation of probability distribution and its application to the analysis of the optimum gradient method
- Gradient methods with adaptive step-sizes
- New adaptive stepsize selections in gradient methods
- Algorithms for bound constrained quadratic programming problems
- Smooth and adaptive gradient method with retards
- On the behavior of the gradient norm in the steepest descent method
- Projected Barzilai-Borwein methods for large-scale box-constrained quadratic programming
- Analysis of monotone gradient methods
- R-linear convergence of the Barzilai and Borwein gradient method
- On spectral properties of steepest descent methods
- Scaling techniques for gradient projection-type methods in astronomical image deblurring
- The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
- On the Identification Property of a Projected Gradient Method
- Two-Point Step Size Gradient Methods
- Gradient Method with Retards and Generalizations
- Alternate minimization gradient method
- Alternate step gradient method
- A Nonmonotone Line Search Technique for Newton’s Method
- On the Barzilai and Borwein choice of steplength for the gradient method
- The cyclic Barzilai-Borwein method for unconstrained optimization
- Relaxed steepest descent and Cauchy-Barzilai-Borwein method