A flexible coordinate descent method
Abstract: We present a novel randomized block coordinate descent method for the minimization of a convex composite objective function. The method uses (approximate) partial second-order (curvature) information, so that its performance is more robust when applied to highly nonseparable or ill-conditioned problems. We call the method Flexible Coordinate Descent (FCD). At each iteration of FCD, a block of coordinates is sampled randomly, a quadratic model is formed about that block, and the model is minimized approximately (inexactly) to determine the search direction. An inexpensive line search is then employed to ensure a monotonic decrease in the objective function and the acceptance of large step sizes. We present several high-probability iteration complexity results showing that convergence of FCD is theoretically guaranteed. Finally, we present numerical results on large-scale problems to demonstrate the practical performance of the method.
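The abstract spells out one FCD iteration in prose: sample a block, build a quadratic model from (approximate) partial curvature information, minimize the model inexactly, then line-search. The sketch below instantiates that template in Python for an \(\ell_1\)-regularized least-squares problem of the kind that recurs throughout the reference lists. It is a minimal illustration, not the paper's implementation: the uniform block sampling, the exact block Hessian, the fixed number of inner proximal-gradient steps, and the plain monotone-decrease acceptance test are all simplifying assumptions, and names such as `fcd_sketch` are hypothetical.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def fcd_sketch(A, b, lam, n_iters=200, block_size=10, inner_iters=5,
               beta=0.5, seed=0):
    """Illustrative FCD-style loop for F(x) = 0.5*||Ax - b||^2 + lam*||x||_1."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.zeros(n)
    r = A @ x - b                            # residual, kept up to date
    F = 0.5 * r @ r + lam * np.abs(x).sum()  # current objective value
    for _ in range(n_iters):
        S = rng.choice(n, size=min(block_size, n), replace=False)  # random block
        AS = A[:, S]
        g = AS.T @ r                         # block gradient of the smooth part
        H = AS.T @ AS                        # block curvature (exact here)
        L = np.linalg.norm(H, 2) + 1e-12     # step size for the inner solver
        # Inexactly minimize the block model
        #   m(d) = g'd + 0.5 d'H d + lam*||x_S + d||_1
        # with a handful of proximal-gradient steps.
        d = np.zeros(len(S))
        for _ in range(inner_iters):
            d = soft_threshold(x[S] + d - (g + H @ d) / L, lam / L) - x[S]
        # Backtracking line search: accept the largest tested step size
        # that yields a monotonic decrease in F.
        t = 1.0
        while t > 1e-10:
            xS_new = x[S] + t * d
            r_new = r + AS @ (t * d)
            F_new = (0.5 * r_new @ r_new
                     + lam * (np.abs(x).sum() - np.abs(x[S]).sum()
                              + np.abs(xS_new).sum()))
            if F_new <= F:
                x[S], r, F = xS_new, r_new, F_new
                break
            t *= beta
    return x
```

The exact block Hessian `AS.T @ AS` stands in for the paper's approximate partial second-order information; a cheaper curvature estimate (e.g., a diagonal approximation) would fit the same template without changing the surrounding loop.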
Recommendations
- Random Coordinate Descent Methods for Nonseparable Composite Optimization
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Efficiency of coordinate descent methods on huge-scale optimization problems
- A coordinate-descent primal-dual algorithm with large step size and possibly nonseparable functions
- On the complexity analysis of randomized block-coordinate descent methods
Cites work
- scientific article; zbMATH DE number 2107836
- scientific article; zbMATH DE number 845714
- A coordinate gradient descent method for linearly constrained smooth optimization and support vector machines training
- A coordinate gradient descent method for nonsmooth separable minimization
- A fast active set block coordinate descent algorithm for \(\ell_1\)-regularized least squares
- A random coordinate descent algorithm for optimization problems with composite objective function and linear coupled constraints
- A second-order method for strongly convex \(\ell _1\)-regularization problems
- Accelerated block-coordinate relaxation for regularized optimization
- Accelerated, parallel, and proximal coordinate descent
- An inexact successive quadratic approximation method for \(\ell_1\)-regularized optimization
- Block-coordinate gradient descent method for linearly constrained nonsmooth separable optimization
- Compressed sensing
- Compressive sampling
- Efficiency of coordinate descent methods on huge-scale optimization problems
- Efficient block-coordinate descent algorithms for the group Lasso
- First-order methods of smooth convex optimization with inexact oracle
- Hybrid Random/Deterministic Parallel Algorithms for Convex and Nonconvex Big Data Optimization
- IMRO: A proximal quasi-Newton method for solving \(\ell_1\)-regularized least squares problems
- Inexact coordinate descent: complexity and preconditioning
- Introductory lectures on convex optimization. A basic course.
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Nonsmooth mechanics and analysis. Theoretical and numerical advances
- Numerical Optimization
- On the complexity analysis of randomized block-coordinate descent methods
- On the complexity of parallel coordinate descent
- On the convergence of inexact block coordinate descent methods for constrained optimization
- Parallel Selective Algorithms for Nonconvex Big Data Optimization
- Parallel coordinate descent methods for big data optimization
- Performance of first- and second-order methods for \(\ell_1\)-regularized least squares problems
- Practical inexact proximal quasi-Newton method with global complexity analysis
- Proximal Newton-type methods for minimizing composite functions
- Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information
- Standardization and the group lasso penalty
- Stochastic dual coordinate ascent methods for regularized loss minimization
Cited in (19)
- Inexact variable metric stochastic block-coordinate descent for regularized optimization
- Inexact coordinate descent: complexity and preconditioning
- A coordinate descent method for total variation minimization
- scientific article; zbMATH DE number 4078635
- Accelerating block coordinate descent methods with identification strategies
- scientific article; zbMATH DE number 966589
- A fast active set block coordinate descent algorithm for \(\ell_1\)-regularized least squares
- Coordinate descent with arbitrary sampling. I: Algorithms and complexity.
- A coordinate-descent primal-dual algorithm with large step size and possibly nonseparable functions
- Linear convergence of randomized feasible descent methods under the weak strong convexity assumption
- Semi-stochastic coordinate descent
- Second order semi-smooth proximal Newton methods in Hilbert spaces
- Globalized inexact proximal Newton-type methods for nonconvex composite functions
- On the complexity analysis of randomized block-coordinate descent methods
- Fast and safe: accelerated gradient methods with optimality certificates and underestimate sequences
- Convergence analysis of inexact randomized iterative methods
- Random Coordinate Descent Methods for Nonseparable Composite Optimization
- A randomized coordinate descent method with volume sampling
- Distributed block coordinate descent for minimizing partially separable functions