Subsampled nonmonotone spectral gradient methods
Publication:2178981
Abstract: This paper deals with subsampled spectral gradient methods for minimizing a finite sum. Subsampled function and gradient approximations are employed to reduce the overall computational cost of the classical spectral gradient methods. Global convergence is enforced by a nonmonotone line search procedure and is proved when the functions and gradients are approximated with increasing accuracy. R-linear convergence and worst-case iteration complexity are investigated in the case of a strongly convex objective function. Numerical results on well-known binary classification problems are given to show the effectiveness of this framework and to analyze the effect of different spectral coefficient approximations arising from the variable-sample nature of the procedure.
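To make the abstract's ingredients concrete, the following is a minimal sketch, not the authors' exact algorithm, of a subsampled spectral (Barzilai–Borwein) gradient method with a max-type nonmonotone line search and a sample size that grows over the iterations (so the function and gradient are approximated with increasing accuracy). All parameter values, safeguards, and names (`growth`, `M`, `eta`, and so on) are illustrative assumptions.

```python
# Sketch of a subsampled spectral gradient method with a nonmonotone
# line search for a finite sum f(x) = (1/N) * sum_i f_i(x).
import numpy as np

def subsampled_spectral_gradient(f_batch, g_batch, x0, N, n0=32,
                                 growth=1.1, M=10, eta=1e-4,
                                 max_iter=200, seed=0):
    """f_batch(x, idx) / g_batch(x, idx): sample-average loss and
    gradient over the index set idx drawn from {0, ..., N-1}."""
    rng = np.random.default_rng(seed)
    x, n, alpha = x0.astype(float).copy(), float(n0), 1.0
    x_prev = g_prev = None
    fvals = []                            # recent subsampled f-values
    for _ in range(max_iter):
        idx = rng.choice(N, size=min(int(n), N), replace=False)
        g = g_batch(x, idx)
        if x_prev is not None:            # BB1 spectral coefficient,
            s, y = x - x_prev, g - g_prev # safeguarded into a box
            sy = s @ y
            alpha = min(max((s @ s) / sy, 1e-5), 1e5) if sy > 1e-12 else 1.0
        fvals.append(f_batch(x, idx))
        f_ref = max(fvals[-M:])           # nonmonotone reference value
        d = -alpha * g                    # spectral gradient direction
        t = 1.0                           # backtracking Armijo-type search
        while f_batch(x + t * d, idx) > f_ref + eta * t * (g @ d) and t > 1e-10:
            t *= 0.5
        x_prev, g_prev = x.copy(), g
        x = x + t * d
        n *= growth                       # grow the sample: increasingly
                                          # accurate approximations
    return x
```

For instance, the sketch can be driven on a toy logistic-regression problem (binary classification, the problem class used in the paper's experiments); the data and loss below are again illustrative:

```python
rng = np.random.default_rng(1)
A = rng.standard_normal((1000, 20))
b = np.sign(A @ rng.standard_normal(20) + 0.1 * rng.standard_normal(1000))

def f_batch(x, idx):                      # average logistic loss on idx
    return np.mean(np.logaddexp(0.0, -b[idx] * (A[idx] @ x)))

def g_batch(x, idx):                      # its sample-average gradient
    m = np.clip(b[idx] * (A[idx] @ x), -30.0, 30.0)
    p = 1.0 / (1.0 + np.exp(m))
    return -(A[idx] * (b[idx] * p)[:, None]).mean(axis=0)

x_star = subsampled_spectral_gradient(f_batch, g_batch, np.zeros(20), N=1000)
```

Note that because consecutive gradients are evaluated on different subsamples, the BB pair (s, y) mixes information from different sample sets; this is exactly the variable-sample effect on the spectral coefficient that the paper analyzes, and the reason the sketch safeguards alpha into a fixed interval.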
Cites work
- scientific article; zbMATH DE number 2221955 (no title available)
- A Convergent Incremental Gradient Method with a Constant Step Size
- A derivative-free line search and global convergence of Broyden-like method for nonlinear equations
- Adaptive cubic regularization methods with dynamic inexact Hessian information and applications to finite-sum minimization
- Adaptive sampling strategies for stochastic optimization
- An investigation of Newton-sketch and subsampled Newton methods
- Hybrid deterministic-stochastic methods for data fitting
- Nonmonotone line search methods with variable sample size
- On the employment of inexact restoration for the minimization of functions whose evaluation is subject to errors
- On the steplength selection in gradient methods for unconstrained optimization
- On the worst-case evaluation complexity of non-monotone line search algorithms
- Optimization methods for large-scale machine learning
- Sample size selection in optimization methods for machine learning
- Spectral projected gradient method for stochastic optimization
- Sub-sampled Newton methods
- The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
- The cyclic Barzilai–Borwein method for unconstrained optimization
- Two-Point Step Size Gradient Methods
Cited in (10)
- Subsampled Hessian Newton Methods for Supervised Learning
- Optimal sample complexity of subgradient descent for amplitude flow via non-Lipschitz matrix concentration
- A generalized worst-case complexity analysis for non-monotone line searches
- AN-SPS: adaptive sample size nonmonotone line search spectral projected subgradient method for convex constrained optimization problems
- Subsampled first-order optimization methods with applications in imaging
- Subsampled inexact Newton methods for minimizing large sums of convex functions
- A class of nonmonotone spectral memory gradient method
- Nonmonotone spectral gradient method for sparse recovery
- Spectral projected subgradient method for nonsmooth convex optimization problems
- Convergence analysis of a subsampled Levenberg-Marquardt algorithm