Subsampled nonmonotone spectral gradient methods
DOI: 10.2478/CAIM-2020-0002 · zbMATH Open: 1439.49056 · arXiv: 1812.06822 · OpenAlex: W3003832934 · MaRDI QID: Q2178981
Greta Malaspina, Stefania Bellavia, Nataša Krklec Jerinkić
Publication date: 12 May 2020
Published in: Communications in Applied and Industrial Mathematics
Full work available at URL: https://arxiv.org/abs/1812.06822
Mathematics Subject Classification
- Numerical mathematical programming methods (65K05)
- Learning and adaptive systems in artificial intelligence (68T05)
- Analysis of algorithms (68W40)
- Numerical methods based on nonlinear programming (49M37)
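The title combines three ingredients that also appear among the cited works: spectral (Barzilai-Borwein) step lengths, a nonmonotone line search, and subsampled (mini-batch) function and gradient estimates for finite-sum objectives. The following is a minimal illustrative sketch of that combination, not the authors' algorithm; the function names, parameter choices, and safeguards are assumptions made for the example.

```python
import numpy as np

def subsampled_nonmonotone_spectral_gradient(
    fi, gi, x0, n_samples, batch_size=32, max_iter=200,
    memory=10, c=1e-4, alpha_min=1e-8, alpha_max=1e8, seed=0,
):
    """Sketch: Barzilai-Borwein steps on subsampled gradients with a
    Grippo-Lampariello-Lucidi-type nonmonotone backtracking line search.
    fi(x, idx) / gi(x, idx) return the mean loss / gradient over sample idx."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    alpha = 1.0
    idx = rng.choice(n_samples, batch_size, replace=False)
    f_hist = [fi(x, idx)]                 # recent subsampled function values
    x_old, g_old = None, None
    for _ in range(max_iter):
        idx = rng.choice(n_samples, batch_size, replace=False)
        g = gi(x, idx)
        if g_old is not None:
            s, y = x - x_old, g - g_old
            sy = s @ y
            if sy > 1e-12:                # BB1 step length with safeguards
                alpha = np.clip((s @ s) / sy, alpha_min, alpha_max)
        d = -alpha * g                    # spectral gradient direction
        f_ref = max(f_hist[-memory:])     # nonmonotone reference value
        t = 1.0
        while fi(x + t * d, idx) > f_ref + c * t * (g @ d) and t > 1e-10:
            t *= 0.5                      # backtracking until sufficient decrease
        x_old, g_old = x, g
        x = x + t * d
        f_hist.append(fi(x, idx))
    return x

# Toy finite-sum least squares: f(x) = (1/N) * sum_i (a_i^T x - b_i)^2
rng = np.random.default_rng(1)
A, b = rng.standard_normal((1000, 20)), rng.standard_normal(1000)
fi = lambda x, idx: np.mean((A[idx] @ x - b[idx]) ** 2)
gi = lambda x, idx: 2 * A[idx].T @ (A[idx] @ x - b[idx]) / len(idx)
x_hat = subsampled_nonmonotone_spectral_gradient(fi, gi, np.zeros(20), 1000)
```

The nonmonotone test compares against the maximum of the last few subsampled function values rather than the previous value alone, which is what allows the typically long spectral steps to be accepted; the fixed batch size here stands in for the sample-size selection schemes discussed in the cited works.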
Cites Work
- The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
- Two-Point Step Size Gradient Methods
- Sample size selection in optimization methods for machine learning
- The cyclic Barzilai-Borwein method for unconstrained optimization
- A Convergent Incremental Gradient Method with a Constant Step Size
- On the worst-case evaluation complexity of non-monotone line search algorithms
- A derivative-free line search and global convergence of Broyden-like method for nonlinear equations
- On the steplength selection in gradient methods for unconstrained optimization
- Nonmonotone line search methods with variable sample size
- Hybrid deterministic-stochastic methods for data fitting
- Spectral projected gradient method for stochastic optimization
- Optimization Methods for Large-Scale Machine Learning
- Adaptive Sampling Strategies for Stochastic Optimization
- Sub-sampled Newton methods
- An investigation of Newton-Sketch and subsampled Newton methods
- Adaptive cubic regularization methods with dynamic inexact Hessian information and applications to finite-sum minimization
- On the employment of inexact restoration for the minimization of functions whose evaluation is subject to errors
Cited In (7)
- Nonmonotone spectral gradient method for sparse recovery
- Spectral projected subgradient method for nonsmooth convex optimization problems
- A class of nonmonotone spectral memory gradient method
- A generalized worst-case complexity analysis for non-monotone line searches
- Optimal sample complexity of subgradient descent for amplitude flow via non-Lipschitz matrix concentration
- AN-SPS: adaptive sample size nonmonotone line search spectral projected subgradient method for convex constrained optimization problems
- Subsampled Hessian Newton Methods for Supervised Learning