Preconditioning meets biased compression for efficient distributed optimization
From MaRDI portal
Publication: 6149587
DOI: 10.1007/s10287-023-00496-6
OpenAlex: W4390154726
MaRDI QID: Q6149587
FDO: Q6149587
Authors: Vitali Pirau, Aleksandr Beznosikov, M. Takáč, A. V. Gasnikov
Publication date: 6 February 2024
Published in: Computational Management Science
Full work available at URL: https://doi.org/10.1007/s10287-023-00496-6
Recommendations
- Stochastic gradient methods with preconditioned updates
- Hyperfast second-order local solvers for efficient statistically preconditioned distributed optimization
- Stochastic distributed learning with gradient quantization and double-variance reduction
- Communication-efficient distributed optimization of self-concordant empirical loss
- On data preconditioning for regularized loss minimization
Cites Work
Cited In (3)