Byzantine-robust loopless stochastic variance-reduced gradient
Publication: 6134045
DOI: 10.1007/978-3-031-35305-5_3
zbMATH Open: 1528.90189
arXiv: 2303.04560
OpenAlex: W4381956520
MaRDI QID: Q6134045
FDO: Q6134045
Publication date: 21 August 2023
Published in: Mathematical Optimization Theory and Operations Research
Abstract: Distributed optimization with open collaboration is a popular field since it gives small groups, companies, universities, and individuals an opportunity to jointly solve huge-scale problems. However, standard optimization algorithms are fragile in such settings due to the possible presence of so-called Byzantine workers -- participants that can send (intentionally or not) incorrect information instead of the information prescribed by the protocol (e.g., sending the anti-gradient instead of the stochastic gradient). Thus, the problem of designing distributed methods with provable robustness to Byzantine workers has been receiving a lot of attention recently. In particular, several works consider a very promising way to achieve Byzantine tolerance by exploiting variance reduction and robust aggregation. The existing approaches use SAGA- and SARAH-type variance-reduced estimators, while another popular estimator -- SVRG -- has not been studied in the context of Byzantine robustness. In this work, we close this gap in the literature and propose a new method -- Byzantine-Robust Loopless Stochastic Variance Reduced Gradient (BR-LSVRG). We derive non-asymptotic convergence guarantees for the new method in the strongly convex case and compare its performance with existing approaches in numerical experiments.
Full work available at URL: https://arxiv.org/abs/2303.04560
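To make the mechanism described in the abstract concrete, the sketch below combines a loopless SVRG-type gradient estimator on each honest worker with a robust aggregation rule on the server (coordinate-wise median, used here purely as a placeholder) on a synthetic least-squares problem. The toy data, the attack model, the aggregation rule, and all parameter values are illustrative assumptions; this is not the authors' exact BR-LSVRG algorithm or theoretical setting.

```python
# Minimal sketch: loopless SVRG estimator + robust aggregation under Byzantine workers.
# Everything here (toy quadratic losses, median aggregation, anti-gradient attack,
# parameter values) is an illustrative assumption, not the paper's exact method.
import numpy as np

rng = np.random.default_rng(0)
d, n_samples, n_workers, n_byzantine = 5, 100, 10, 2
p, step = 0.1, 0.05  # probability of refreshing the reference point, step size

# Toy data: worker i holds quadratic losses f_j(x) = 0.5 * (a_j^T x - b_j)^2
A = [rng.normal(size=(n_samples, d)) for _ in range(n_workers)]
b = [rng.normal(size=n_samples) for _ in range(n_workers)]

def sample_grad(i, x, j):
    """Stochastic gradient of worker i's j-th loss at point x."""
    a = A[i][j]
    return a * (a @ x - b[i][j])

def full_grad(i, x):
    """Full local gradient of worker i at point x."""
    return A[i].T @ (A[i] @ x - b[i]) / n_samples

x = np.zeros(d)       # current iterate held by the server
w = x.copy()          # reference point of the loopless SVRG estimator
mu = [full_grad(i, w) for i in range(n_workers)]  # stored full local gradients at w

for k in range(200):
    grads = []
    for i in range(n_workers):
        if i < n_byzantine:
            # Byzantine worker: sends the anti-gradient (one example attack)
            grads.append(-full_grad(i, x))
            continue
        j = rng.integers(n_samples)
        # Loopless SVRG estimator: grad f_j(x) - grad f_j(w) + full local gradient at w
        g = sample_grad(i, x, j) - sample_grad(i, w, j) + mu[i]
        grads.append(g)

    # Robust aggregation: coordinate-wise median instead of the plain mean
    g_agg = np.median(np.stack(grads), axis=0)
    x -= step * g_agg

    # "Loopless" part: refresh the reference point with small probability p
    # instead of running an explicit inner loop of fixed length.
    if rng.random() < p:
        w = x.copy()
        mu = [full_grad(i, w) for i in range(n_workers)]

avg_grad = np.mean([full_grad(i, x) for i in range(n_workers)], axis=0)
print("norm of the average full gradient at the final iterate:", np.linalg.norm(avg_grad))
```

The coin flip with probability p plays the role of SVRG's inner loop: the expensive full local gradients are recomputed only occasionally, while the median-based aggregation is what limits the influence of the Byzantine workers in this toy setup.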
Cites Work
- A Stochastic Approximation Method
- Title not available
- Robust Stochastic Approximation Approach to Stochastic Programming
- Understanding Machine Learning
- Stochastic First- and Zeroth-Order Methods for Nonconvex Stochastic Programming
- Title not available
- The Byzantine Generals Problem
- Some methods of speeding up the convergence of iteration methods
- Title not available
- Lectures on convex optimization
- Adaptivity of Stochastic Gradient Methods for Nonconvex Optimization
- Brief Announcement
- Federated Variance-Reduced Stochastic Gradient Descent With Robustness to Byzantine Attacks
Cited In (1)