Block-cyclic stochastic coordinate descent for deep neural networks
From MaRDI portal
Publication: 6054553
DOI: 10.1016/j.neunet.2021.04.001
zbMath: 1521.68195
arXiv: 1711.07190
OpenAlex: W3156338641
MaRDI QID: Q6054553
Stefano Soatto, Kensuke Nakamura, Byung-Woo Hong
Publication date: 28 September 2023
Published in: Neural Networks
Full work available at URL: https://arxiv.org/abs/1711.07190
Mathematics Subject Classification: Artificial neural networks and deep learning (68T07); Applications of mathematical programming (90C90); Stochastic programming (90C15)
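For context, the method named in the title partitions the model parameters into blocks and cycles through them, applying a stochastic (mini-batch) gradient step to one block at a time while holding the others fixed. The sketch below illustrates this on a toy least-squares problem; all names, block sizes, and hyperparameters are illustrative choices, not taken from the paper (which applies the scheme to the layers of deep networks).

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic noiseless least-squares problem: minimize ||A w - b||^2 over w.
n_samples, n_features = 200, 12
A = rng.normal(size=(n_samples, n_features))
w_true = rng.normal(size=n_features)
b = A @ w_true

# Partition the coordinates into contiguous blocks (illustrative split;
# in a deep network the natural blocks would be the layers).
n_blocks, lr, batch = 3, 0.05, 32
blocks = np.array_split(np.arange(n_features), n_blocks)

w = np.zeros(n_features)
for epoch in range(300):
    # Cyclic pass over the blocks; each block update uses a fresh
    # mini-batch gradient restricted to that block's coordinates,
    # with the other blocks held at their current values.
    for idx in blocks:
        sample = rng.choice(n_samples, size=batch, replace=False)
        residual = A[sample] @ w - b[sample]
        grad_block = A[sample][:, idx].T @ residual / batch
        w[idx] -= lr * grad_block

loss = np.mean((A @ w - b) ** 2)
```

Because each step touches only one block, the per-update cost scales with the block size rather than the full parameter count, which is the usual motivation for coordinate-descent schemes on large models.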
Cites Work
- Parallel coordinate descent methods for big data optimization
- Adaptive stepsizes for recursive estimation with applications in approximate dynamic programming
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- Randomized Methods for Linear Constraints: Convergence Rates and Conditioning
- On the Nonasymptotic Convergence of Cyclic Coordinate Descent Methods
- Model Selection and Estimation in Regression with Grouped Variables
- A Stochastic Approximation Method