On the convergence of a block-coordinate incremental gradient method
Publication: 2100401
DOI: 10.1007/s00500-021-05695-4
zbMath: 1498.90157
OpenAlex: W3135096334
MaRDI QID: Q2100401
Publication date: 22 November 2022
Published in: Soft Computing
Full work available at URL: https://doi.org/10.1007/s00500-021-05695-4
Cites Work
- An incremental decomposition method for unconstrained optimization
- On the complexity analysis of randomized block-coordinate descent methods
- Incremental gradient algorithms with stepsizes bounded away from zero
- Block layer decomposition schemes for training deep neural networks
- Coordinate descent algorithms
- Convergent Decomposition Techniques for Training RBF Neural Networks
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- Large-Scale Machine Learning with Stochastic Gradient Descent
- Globally convergent block-coordinate techniques for unconstrained optimization
- Gradient Convergence in Gradient Methods with Errors
- Optimization Methods for Large-Scale Machine Learning
- Incremental Least Squares Methods and the Extended Kalman Filter
- On the Convergence of Block Coordinate Descent Type Methods
- A Convergent Incremental Gradient Method with a Constant Step Size
- A Stochastic Approximation Method
- Logistic regression, AdaBoost and Bregman distances