Efficient Byzantine-robust distributed inference with regularization: a trade-off between compression and adversary
From MaRDI portal
Publication: Q6571196
DOI: 10.1016/J.INS.2024.121010
MaRDI QID: Q6571196
Authors: Xingcai Zhou, Guang Yang, Le Chang, Shao-Gao Lv
Publication date: 11 July 2024
Published in: Information Sciences
Keywords: compression, distributed learning, adversary, communication-efficient, Byzantine-robust, statistical error rate
Cites Work
- Nearly unbiased variable selection under minimax concave penalty
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Title not available
- Distributed testing and estimation under sparse high dimensional models
- Title not available
- Communication-Efficient Distributed Statistical Inference
- The landscape of empirical risk for nonconvex losses
- Distributed secure state estimation for cyber-physical systems under sensor attacks
- Byzantine-resilient distributed state estimation: a min-switching approach
- Fault-Tolerant Multi-Agent Optimization
- Title not available