An adaptive Polyak heavy-ball method
From MaRDI portal
Publication:2102380
DOI: 10.1007/s10994-022-06215-7
OpenAlex: W4285801387
Wikidata: Q114224928
Scholia: Q114224928
MaRDI QID: Q2102380
Publication date: 28 November 2022
Published in: Machine Learning
Full work available at URL: https://doi.org/10.1007/s10994-022-06215-7
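The publication's subject is an adaptive variant of Polyak's heavy-ball method. For orientation, here is a minimal sketch of the classical (non-adaptive) heavy-ball iteration, x_{k+1} = x_k − α∇f(x_k) + β(x_k − x_{k−1}), with fixed step size α and momentum β; the paper's contribution is an adaptive choice of these parameters, whose details are not part of this record:

```python
# Sketch of Polyak's classical heavy-ball iteration (not the paper's
# adaptive method): x_{k+1} = x_k - alpha * grad(x_k) + beta * (x_k - x_{k-1}).
# alpha (step size) and beta (momentum) are fixed constants here.

def heavy_ball(grad, x0, alpha=0.1, beta=0.9, iters=100):
    """Run `iters` heavy-ball steps from x0 for a gradient oracle `grad`."""
    x_prev = x0
    x = x0
    for _ in range(iters):
        x_next = x - alpha * grad(x) + beta * (x - x_prev)
        x_prev, x = x, x_next
    return x

# Usage: minimize f(x) = x^2 (gradient 2x); the iterates approach 0.
x_star = heavy_ball(lambda x: 2 * x, x0=5.0, alpha=0.05, beta=0.5, iters=200)
```

The momentum term β(x_k − x_{k−1}) reuses the previous displacement, which is what distinguishes heavy ball from plain gradient descent.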
Related Items (1)
Uses Software
Cites Work
- ImageNet
- Accelerated gradient methods for nonconvex nonlinear and stochastic programming
- A literature survey of benchmark functions for global optimisation problems
- iPiasco: inertial proximal algorithm for strongly convex optimization
- Introductory lectures on convex optimization. A basic course.
- Stochastic heavy ball
- Stochastic gradient descent with Polyak's learning rate
- Momentum and stochastic momentum for stochastic gradient, Newton, proximal point and subspace descent methods
- Automated adaptation strategies for stream learning
- iPiano: Inertial Proximal Algorithm for Nonconvex Optimization
- An Accelerated Randomized Proximal Coordinate Gradient Method and its Application to Regularized Empirical Risk Minimization
- Analysis and Design of Optimization Algorithms via Integral Quadratic Constraints
- A New Class of Incremental Gradient Methods for Least Squares Problems
- First-Order Methods in Optimization
- Stochastic First- and Zeroth-Order Methods for Nonconvex Stochastic Programming
- Some methods of speeding up the convergence of iteration methods
- A Stochastic Approximation Method
- Heavy-ball method in nonconvex optimization problems
This page was built for publication: An adaptive Polyak heavy-ball method