Adaptive step size rules for stochastic optimization in large-scale learning
Publication: 6116586
DOI: 10.1007/s11222-023-10218-2
zbMath: 1516.62033
MaRDI QID: Q6116586
No author found.
Publication date: 18 July 2023
Published in: Statistics and Computing
Cites Work
- A Stochastic Quasi-Newton Method for Large-Scale Optimization
- Penalized likelihood regression for generalized linear models with non-quadratic penalties
- Adaptive stochastic gradient descent optimisation for image registration
- Introductory lectures on convex optimization. A basic course.
- Stochastic gradient descent with Barzilai-Borwein update step for SVM
- Rates of convergence of adaptive step-size of stochastic approximation algorithms
- Importance sampling in stochastic optimization: an application to intertemporal portfolio choice
- Control variates for stochastic gradient MCMC
- Smoothing projected Barzilai-Borwein method for constrained non-Lipschitz optimization
- Adaptivity of averaged stochastic gradient descent to local strong convexity for logistic regression
- Accelerated Stochastic Approximation
- Two-Point Step Size Gradient Methods
- Accelerated Stochastic Approximation
- Probabilistic Line Searches for Stochastic Optimization
- Spectral Properties of Barzilai-Borwein Rules in Solving Singly Linearly Constrained Optimization Problems Subject to Lower and Upper Bounds
- A Stochastic Line Search Method with Expected Complexity Analysis
- Stochastic (Approximate) Proximal Point Methods: Convergence, Optimality, and Adaptivity
- Stochastic First- and Zeroth-Order Methods for Nonconvex Stochastic Programming
- Learning Applied to Successive Approximation Algorithms
- A Stochastic Approximation Method
- Scalable estimation strategies based on stochastic approximations: classical results and new insights
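Several of the cited works concern Barzilai-Borwein-type step sizes for stochastic gradient methods (e.g., "Two-Point Step Size Gradient Methods" and "Stochastic gradient descent with Barzilai-Borwein update step for SVM"). The sketch below illustrates only the classical BB1 rule, alpha = s^T s / (s^T y), recomputed once per epoch inside plain mini-batch SGD on a least-squares problem; it is not the method proposed in the recorded publication, and the problem data, batch size, clipping bounds, and epoch-wise update schedule are all assumptions made for the example.

```python
# Illustrative sketch only: a Barzilai-Borwein (BB1) step size reused inside
# mini-batch SGD on a synthetic least-squares problem. Not the recorded paper's
# method; all data and hyperparameters below are assumptions.
import numpy as np

rng = np.random.default_rng(0)
n, d = 1000, 20
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.1 * rng.normal(size=n)

def full_grad(w):
    # Gradient of the full objective 0.5/n * ||Xw - y||^2
    return X.T @ (X @ w - y) / n

w = np.zeros(d)
step = 1e-2                      # initial step before the first BB update
w_prev, g_prev = w.copy(), full_grad(w)
batch_size, n_epochs = 32, 30

for epoch in range(n_epochs):
    for idx in np.array_split(rng.permutation(n), n // batch_size):
        g = X[idx].T @ (X[idx] @ w - y[idx]) / len(idx)   # mini-batch gradient
        w = w - step * g
    # Epoch-wise BB1 step: alpha = s^T s / (s^T y),
    # with s = w_k - w_{k-1} and y = g_k - g_{k-1} from full gradients.
    g_full = full_grad(w)
    s, yk = w - w_prev, g_full - g_prev
    if abs(s @ yk) > 1e-12:
        step = float(s @ s / (s @ yk))
        step = min(max(step, 1e-4), 1.0)   # clip to keep the stochastic iteration stable
    w_prev, g_prev = w.copy(), g_full

print("final parameter error:", np.linalg.norm(w - w_true))
```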