A randomized coordinate descent method with volume sampling
From MaRDI portal
Publication:3300772
Abstract: We analyze the coordinate descent method with a new coordinate selection strategy called volume sampling. This strategy selects subsets of variables of a given size with probability proportional to the determinants of the corresponding principal submatrices of the matrix that bounds the curvature of the objective function. In the particular case when the subsets have size one, volume sampling coincides with the well-known strategy of sampling coordinates proportionally to their Lipschitz constants. For coordinate descent with volume sampling, we establish convergence rates for both convex and strongly convex problems. Our theoretical results show that increasing the size of the subsets can accelerate the method by up to a factor that depends on the spectral gap between the corresponding largest eigenvalues of the curvature matrix. Several numerical experiments confirm our theoretical conclusions.
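The selection rule described in the abstract can be sketched in a few lines: draw a subset S of a fixed size with probability proportional to det(B[S, S]), where B is a matrix bounding the curvature of the objective, and then take a block step on the coordinates in S. The snippet below is a minimal illustrative sketch, not the paper's algorithm: the brute-force enumeration of subsets, the quadratic test objective, and the exact block solve are all assumptions made for clarity; for size-one subsets the determinants are the diagonal entries of B, recovering Lipschitz-proportional sampling.

```python
import numpy as np
from itertools import combinations

def volume_sample(B, tau, rng):
    """Sample a subset S of size tau with P(S) proportional to det(B[S, S]).

    B is a symmetric positive definite matrix bounding the curvature of the
    objective.  For tau = 1 the determinants reduce to the diagonal entries
    of B, i.e. sampling coordinates proportionally to their Lipschitz
    constants.  Enumerating all subsets is only feasible for small n and is
    used here purely for illustration.
    """
    n = B.shape[0]
    subsets = list(combinations(range(n), tau))
    weights = np.array([np.linalg.det(B[np.ix_(S, S)]) for S in subsets])
    probs = weights / weights.sum()
    return list(subsets[rng.choice(len(subsets), p=probs)])

# Illustration on a small quadratic f(x) = 0.5 x^T B x - b^T x, whose
# curvature is bounded by B itself (an assumed toy objective).
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 4))
B = A.T @ A + 0.1 * np.eye(4)     # SPD curvature bound
b = rng.standard_normal(4)

x = np.zeros(4)
for _ in range(200):
    S = volume_sample(B, tau=2, rng=rng)
    grad_S = B[S] @ x - b[S]      # partial gradient on the sampled block
    # Exact minimization over the block S (a block Newton-type step).
    x[S] -= np.linalg.solve(B[np.ix_(S, S)], grad_S)

x_star = np.linalg.solve(B, b)    # exact minimizer, for comparison
print(float(np.linalg.norm(x - x_star)))
```

With subsets of size one the same loop performs classical randomized coordinate descent with Lipschitz-proportional sampling, which is the base case the abstract mentions.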
Recommendations
- Coordinate descent with arbitrary sampling. I: Algorithms and complexity.
- A flexible coordinate descent method
- Coordinate descent with arbitrary sampling. II: Expected separable overapproximation.
- On optimal probabilities in stochastic coordinate descent methods
- Efficiency of coordinate descent methods on huge-scale optimization problems
Cites work
- scientific article; zbMATH DE number 1460605
- An accelerated randomized proximal coordinate gradient method and its application to regularized empirical risk minimization
- Coordinate descent algorithms
- Efficiency of coordinate descent methods on huge-scale optimization problems
- Efficiency of the accelerated coordinate descent method on structured optimization problems
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Matrix approximation and projective clustering via volume sampling
- On the convergence of block coordinate descent type methods
- Optimization in high dimensions via accelerated, parallel, and proximal coordinate descent
- Parallel coordinate descent methods for big data optimization
- Parallel random coordinate descent method for composite minimization: convergence analysis and error bounds
- Random block coordinate descent methods for linearly constrained optimization over networks
- Reverse iterative volume sampling for linear regression
- Smooth minimization of non-smooth functions
- Smoothing and first order methods: a unified framework
- Stochastic dual coordinate ascent methods for regularized loss minimization
Cited in (3)