New adaptive stepsize selections in gradient methods
From MaRDI portal
Cited in (67)
- Improving the global convergence of inexact restoration methods for constrained optimization problems
- New gradient methods with adaptive stepsizes by approximate models
- Cyclic gradient methods for unconstrained optimization
- Barzilai–Borwein-like rules in proximal gradient schemes for \(\ell_1\)-regularized problems
- AN-SPS: adaptive sample size nonmonotone line search spectral projected subgradient method for convex constrained optimization problems
- Delayed weighted gradient method with simultaneous step-sizes for strongly convex optimization
- A low-cost optimization approach for solving minimum norm linear systems and linear least-squares problems
- Shearlet-based regularization in statistical inverse learning with an application to x-ray tomography
- A gradient method exploiting the two dimensional quadratic termination property
- A stochastic variance reduced gradient method with adaptive step for stochastic optimization
- On the stationarity for nonlinear optimization problems with polyhedral constraints
- A homogeneous Rayleigh quotient with applications in gradient methods
- ACQUIRE: an inexact iteratively reweighted norm approach for TV-based Poisson image restoration
- A class of gradient unconstrained minimization algorithms with adaptive stepsize
- Runge-Kutta-like scaling techniques for first-order methods in convex optimization
- A comparison of edge-preserving approaches for differential interference contrast microscopy
- On the regularizing behavior of the SDA and SDC gradient methods in the solution of linear ill-posed problems
- An extended delayed weighted gradient algorithm for solving strongly convex optimization problems
- Comparison of active-set and gradient projection-based algorithms for box-constrained quadratic programming
- A convergent least-squares regularized blind deconvolution approach
- Numerical methods for parameter estimation in Poisson data inversion
- New stepsizes for the gradient method
- A family of spectral gradient methods for optimization
- Accelerating gradient projection methods for \(\ell _1\)-constrained signal recovery by steplength selection rules
- On the asymptotic convergence and acceleration of gradient methods
- Variable metric techniques for forward-backward methods in imaging
- Delayed gradient methods for symmetric and positive definite linear systems
- Solving nonlinear systems of equations via spectral residual methods: stepsize selection and applications
- On \(R\)-linear convergence analysis for a class of gradient methods
- scientific article (no title available); zbMATH DE number 7306906
- Geometrical inverse matrix approximation for least-squares problems and acceleration strategies
- Randomized algorithms for high quality treatment planning in volumetric modulated arc therapy
- On the acceleration of the Barzilai-Borwein method
- Iterative regularization algorithms for constrained image deblurring on graphics processors
- Convergence of inexact forward-backward algorithms using the forward-backward envelope
- A nonsmooth regularization approach based on shearlets for Poisson noise removal in ROI tomography
- Modified subspace Barzilai-Borwein gradient method for non-negative matrix factorization
- Scaling techniques for gradient projection-type methods in astronomical image deblurring
- Scaled diagonal gradient-type method with extra update for large-scale unconstrained optimization
- Hybrid limited memory gradient projection methods for box-constrained optimization problems
- A second-order gradient method for convex minimization
- A new steplength selection for scaled gradient methods with application to image deblurring
- A delayed weighted gradient method for strictly convex quadratic minimization
- Scaling techniques for \(\epsilon\)-subgradient methods
- A two-phase gradient method for quadratic programming problems with a single linear constraint and bounds on the variables
- On some curvature-dependent steplength for the gradient method
- A scaled gradient projection method for Bayesian learning in dynamical systems
- A cyclic projected gradient method
- On the rate of convergence of projected Barzilai-Borwein methods
- A harmonic framework for stepsize selection in gradient methods
- An efficient gradient method using the Yuan steplength
- An accelerated minimal gradient method with momentum for strictly convex quadratic optimization
- On the steplength selection in gradient methods for unconstrained optimization
- Gradient methods exploiting spectral properties
- On projected alternating BB methods for variational inequalities
- An efficient gradient method with approximate optimal stepsize for the strictly convex quadratic minimization problem
- Shearlet-based regularized reconstruction in region-of-interest computed tomography
- Reconstruction of 3D X-ray CT images from reduced sampling by a scaled gradient projection algorithm
- Ritz-like values in steplength selections for stochastic gradient methods
- Equipping the Barzilai-Borwein method with the two dimensional quadratic termination property
- Gradient methods with adaptive step-sizes
- Steplength selection in gradient projection methods for box-constrained quadratic programs
- A family of optimal weighted conjugate-gradient-type methods for strictly convex quadratic minimization
- Fast gradient methods with alignment for symmetric linear systems without using Cauchy step
- Adaptive two-point stepsize gradient algorithm
- Inexact Bregman iteration for deconvolution of superimposed extended and point sources
- Spectral properties of Barzilai-Borwein rules in solving singly linearly constrained optimization problems subject to lower and upper bounds
This page was built for publication: New adaptive stepsize selections in gradient methods (MaRDI item Q1008801)