DiSCO
Software: 40153
swMATH: 28439; MaRDI QID: Q40153; FDO: Q40153
Author name not available
Cited In (15)
- Composite convex optimization with global and local inexact oracles
- Graph-dependent implicit regularisation for distributed stochastic subgradient descent
- Finite-sample analysis of \(M\)-estimators using self-concordance
- Generalized self-concordant functions: a recipe for Newton-type methods
- Title not available
- Distributed Optimization Based on Gradient Tracking Revisited: Enhancing Convergence Rate via Surrogation
- Distributed adaptive Newton methods with global superlinear convergence
- Communication-efficient distributed statistical inference
- Distributed stochastic variance reduced gradient methods by sampling extra data with replacement
- Limited-memory common-directions method for large-scale optimization: convergence, parallelization, and distributed optimization
- Distributed block-diagonal approximation methods for regularized empirical risk minimization
- Generalized self-concordant analysis of Frank-Wolfe algorithms
- Parallelizing stochastic gradient descent for least squares regression: mini-batching, averaging, and model misspecification
- DSCOVR: randomized primal-dual block coordinate algorithms for asynchronous distributed optimization
- A general distributed dual coordinate optimization framework for regularized loss minimization