Distributed block-diagonal approximation methods for regularized empirical risk minimization
Publication: 782443
DOI: 10.1007/s10994-019-05859-2
zbMath: 1496.68276
arXiv: 1709.03043
OpenAlex: W3102764059
Wikidata: Q126546385
Scholia: Q126546385
MaRDI QID: Q782443
Publication date: 27 July 2020
Published in: Machine Learning
Full work available at URL: https://arxiv.org/abs/1709.03043
MSC classification:
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- General nonlinear regression (62J02)
- Convex programming (90C25)
- Learning and adaptive systems in artificial intelligence (68T05)
- Approximation methods and heuristics in mathematical programming (90C59)
- Distributed algorithms (68W15)
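For orientation, regularized empirical risk minimization (the problem class named in the title) is commonly written in the following generic form; the symbols below (w, x_i, y_i, ℓ, g, λ, n) are standard textbook notation and not necessarily the notation used in the paper itself:

\[
\min_{w \in \mathbb{R}^d} \; \frac{1}{n} \sum_{i=1}^{n} \ell\bigl(w^{\top} x_i,\, y_i\bigr) \;+\; \lambda\, g(w),
\]

where ℓ is a (typically convex) loss evaluated on the n training pairs (x_i, y_i), g is a regularizer such as the ℓ1 norm or the squared ℓ2 norm, and λ > 0 balances data fit against regularization.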
Related Items (2)
Uses Software
Cites Work
- Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers
- Dual coordinate descent methods for logistic regression and maximum entropy models
- A coordinate gradient descent method for nonsmooth separable minimization
- On gradients of functions definable in o-minimal structures
- On semi- and subanalytic geometry
- From error bounds to the complexity of first-order descent methods for convex functions
- Cutting-plane training of structural SVMs
- Linear convergence of first order methods for non-strongly convex optimization
- Inexact successive quadratic approximation for regularized optimization
- Proximal Newton-Type Methods for Minimizing Composite Functions
- Duality Between Subgradient and Conditional Gradient Methods
- The Conjugate Gradient Method and Trust Regions in Large Scale Optimization
- A General Distributed Dual Coordinate Optimization Framework for Regularized Loss Minimization
- Analyzing random permutations for cyclic coordinate descent
- Regularization and Variable Selection Via the Elastic Net
- A Study on L2-Loss (Squared Hinge-Loss) Multiclass SVM
- Stochastic Dual Coordinate Ascent Methods for Regularized Loss Minimization
- Error bounds for convolutional codes and an asymptotically optimum decoding algorithm
- Optimum branchings
- Convex Analysis
- Random permutations fix a worst case for cyclic coordinate descent
- On the learnability and design of output codes for multiclass problems
- Accelerated proximal stochastic dual coordinate ascent for regularized loss minimization