More communication-efficient distributed sparse learning
From MaRDI portal
Publication:6546022
DOI: 10.1016/J.INS.2024.120523
MaRDI QID: Q6546022
Authors: Xingcai Zhou, Guang Yang
Publication date: 29 May 2024
Published in: Information Sciences
Keywords: distributed learning; error feedback; communication efficient; gradient sparse; independent coordinate block; top \(k\)
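The keywords name top-\(k\) gradient sparsification with error feedback, a standard pair of techniques for communication-efficient distributed learning. A minimal generic sketch of that combination (not the paper's specific algorithm; the function names and the fixed choice of `k` are illustrative assumptions):

```python
import numpy as np

def topk_sparsify(vec, k):
    """Keep the k largest-magnitude entries of vec; zero out the rest."""
    out = np.zeros_like(vec)
    idx = np.argpartition(np.abs(vec), -k)[-k:]
    out[idx] = vec[idx]
    return out

def error_feedback_step(grad, residual, k):
    """One error-feedback round: add the locally stored compression
    error to the fresh gradient, transmit only the top-k part, and
    carry the new compression error into the next round."""
    corrected = grad + residual
    sparse = topk_sparsify(corrected, k)       # communicated to the server
    new_residual = corrected - sparse          # kept locally, never sent
    return sparse, new_residual
```

Because the residual is fed back into later rounds, no gradient mass is permanently discarded, which is what makes aggressive sparsification (small `k`) viable.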
Cites Work
- Title not available
- Convergence of a block coordinate descent method for nondifferentiable minimization
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell _{1}$-Constrained Quadratic Programming (Lasso)
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers
- Title not available
- Classification of gene microarrays by penalized logistic regression
- Distributed coordinate descent method for learning with big data
- Accelerated, parallel, and proximal coordinate descent
- Communication-efficient sparse regression
- Distributed testing and estimation under sparse high dimensional models
- Communication-efficient algorithms for statistical optimization
- Title not available
- Communication-efficient distributed statistical inference
- Communication-efficient algorithms for decentralized and stochastic optimization
- A distributed block coordinate descent method for training \(l_1\) regularized linear classifiers
- First-Order Newton-Type Estimator for Distributed Estimation and Inference
- Sign stochastic gradient descents without bounded gradient assumption for the finite sum minimization
- Byzantine-robust variance-reduced federated learning over distributed non-i.i.d. data
- Communication-efficient surrogate quantile regression for non-randomly distributed system