A limited memory steepest descent method
Publication: 715093
DOI: 10.1007/s10107-011-0479-6
zbMath: 1254.90113
OpenAlex: W2093575660
MaRDI QID: Q715093
Publication date: 15 October 2012
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://doi.org/10.1007/s10107-011-0479-6
Mathematics Subject Classification
- Numerical mathematical programming methods (65K05)
- Large-scale problems in mathematical programming (90C06)
- Nonconvex programming, global optimization (90C26)
Related Items
- On the regularizing behavior of the SDA and SDC gradient methods in the solution of linear ill-posed problems
- On the application of the spectral projected gradient method in image segmentation
- Ritz-like values in steplength selections for stochastic gradient methods
- A hybrid quasi-Newton projected-gradient method with application to lasso and basis-pursuit denoising
- Cooperative concurrent asynchronous computation of the solution of symmetric linear systems
- Delayed Gradient Methods for Symmetric and Positive Definite Linear Systems
- A family of optimal weighted conjugate-gradient-type methods for strictly convex quadratic minimization
- Numerical methods for parameter estimation in Poisson data inversion
- Analysis of the Barzilai-Borwein step-sizes for problems in Hilbert spaces
- Gradient method with multiple damping for large-scale unconstrained optimization
- On the steplength selection in gradient methods for unconstrained optimization
- A new steplength selection for scaled gradient methods with application to image deblurring
- A comparison of edge-preserving approaches for differential interference contrast microscopy
- Convergence of Inexact Forward-Backward Algorithms Using the Forward-Backward Envelope
- An efficient gradient method using the Yuan steplength
- New stepsizes for the gradient method
- A cyclic block coordinate descent method with generalized gradient projections
- A second-order gradient method for convex minimization
- A coordinate descent method for total variation minimization
- Asymptotic rate of convergence of a two-layer iterative method of the variational type
- Steplength selection in gradient projection methods for box-constrained quadratic programs
- ACQUIRE: an inexact iteratively reweighted norm approach for TV-based Poisson image restoration
- Scaling Techniques for $\epsilon$-Subgradient Methods
- On Quasi-Newton Forward-Backward Splitting: Proximal Calculus and Convergence
- Gravity-magnetic cross-gradient joint inversion by the cyclic gradient method
- Special issue for SIMAI 2020-2021: large-scale optimization and applications
- Hybrid limited memory gradient projection methods for box-constrained optimization problems
- On the acceleration of the Barzilai-Borwein method
Uses Software
Cites Work
- Representations of quasi-Newton matrices and their use in limited memory methods
- Projected Barzilai-Borwein methods for large-scale box-constrained quadratic programming
- Analysis of monotone gradient methods
- On the asymptotic behaviour of some new gradient methods
- Preconditioned Barzilai-Borwein method for the numerical solution of partial differential equations
- The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
- Two-Point Step Size Gradient Methods
- Updating Quasi-Newton Matrices with Limited Storage
- Some Numerical Results Using a Sparse Matrix Updating Formula in Unconstrained Optimization
- Gradient Method with Retards and Generalizations
- A Nonmonotone Line Search Technique for Newton’s Method
- Approximate solutions and eigenvalue bounds from Krylov subspaces
- On the Barzilai and Borwein choice of steplength for the gradient method
- Gradient projection methods for quadratic programs and applications in training support vector machines
- A Rapidly Convergent Descent Method for Minimization
- CUTEr and SifDec
- Relaxed steepest descent and Cauchy-Barzilai-Borwein method
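For orientation, the cited works "Two-Point Step Size Gradient Methods" and "The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem" introduce the two-point (Barzilai-Borwein) steplength on which limited memory steepest descent builds. The sketch below is a minimal illustration, in Python with names chosen here for illustration, of a gradient iteration using the BB1 steplength; it is not the paper's full limited-memory scheme, which (as the citation of "Approximate solutions and eigenvalue bounds from Krylov subspaces" suggests) derives several steplengths at a time from eigenvalue estimates built from a short history of gradients.

```python
import numpy as np

def bb_gradient_descent(grad, x0, n_iter=100, alpha0=1e-3):
    # Gradient iteration with the Barzilai-Borwein (BB1) steplength
    #   alpha_k = (s^T s) / (s^T y),  s = x_k - x_{k-1},  y = g_k - g_{k-1}.
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    alpha = alpha0                      # initial steplength (a guess)
    for _ in range(n_iter):
        x_new = x - alpha * g           # steepest descent step
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g     # step and gradient differences
        sy = s @ y
        if sy > 0:                      # safeguard: keep the steplength positive
            alpha = (s @ s) / sy
        x, g = x_new, g_new
    return x

# Usage on a strictly convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose gradient is A x - b; the result should be close to solve(A, b).
A = np.diag([1.0, 10.0, 100.0])
b = np.ones(3)
x_min = bb_gradient_descent(lambda x: A @ x - b, np.zeros(3), n_iter=60)
```

The `sy > 0` safeguard simply skips the steplength update when curvature along the step is nonpositive; practical BB-type methods typically pair such steps with a nonmonotone line search, in the spirit of the cited "A Nonmonotone Line Search Technique for Newton's Method".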