Uncertainty principle for communication compression in distributed and federated learning and the search for an optimal compressor
DOI: 10.1093/IMAIAI/IAAB006
zbMATH Open: 1494.94003
arXiv: 2002.08958
OpenAlex: W3154043723
MaRDI QID: Q5095263
FDO: Q5095263
Authors: Mher Safaryan, Egor Shulgin, Peter Richtárik
Publication date: 5 August 2022
Published in: Information and Inference: A Journal of the IMA
Full work available at URL: https://arxiv.org/abs/2002.08958
Recommendations
- Stochastic distributed learning with gradient quantization and double-variance reduction
- Deterministic compression with uncertain priors
- Communication lower bounds for statistical estimation problems via a distributed data processing inequality
MSC classification
- Learning and adaptive systems in artificial intelligence (68T05)
- Communication theory (94A05)
- Coding and information theory (compaction, compression, models of communication, encoding schemes, etc.) (aspects in computer science) (68P30)
Cited In (2)