A generic coordinate descent solver for non-smooth convex optimisation
Publication:5865339
Abstract: We present a generic coordinate descent solver for the minimization of a structured nonsmooth convex objective. In particular, the method can handle problems with linear constraints. The implementation uses efficient residual updates and automatically determines which dual variables should be duplicated. A list of basic functional atoms is pre-compiled for efficiency, and a modelling language in Python allows the user to combine them at run time, so the algorithm can solve a large variety of problems, including the Lasso, sparse multinomial logistic regression, and linear and quadratic programs.
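To make the "efficient residual updates" mentioned in the abstract concrete, the following is a minimal NumPy sketch of proximal coordinate descent on one of the listed applications, the Lasso. It is an illustration only, not the solver described in the paper: the function name `lasso_cd` and all of its parameters are hypothetical, and the paper's atom-based modelling interface is far more general than this single hard-coded objective.

```python
import numpy as np

def lasso_cd(A, b, lam, n_epochs=100, seed=0):
    """Proximal coordinate descent for min_x 0.5*||Ax - b||^2 + lam*||x||_1.

    Illustrative sketch: instead of recomputing A @ x - b after every
    coordinate step, the residual r is patched in O(m) time.
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.zeros(n)
    r = A @ x - b                      # residual, kept up to date
    col_sq = (A ** 2).sum(axis=0)      # per-coordinate Lipschitz constants
    for _ in range(n_epochs * n):
        j = rng.integers(n)            # pick a random coordinate
        if col_sq[j] == 0.0:
            continue
        g = A[:, j] @ r                # partial gradient of the smooth part
        z = x[j] - g / col_sq[j]       # gradient step on coordinate j
        new_xj = np.sign(z) * max(abs(z) - lam / col_sq[j], 0.0)  # soft-threshold
        if new_xj != x[j]:
            r += (new_xj - x[j]) * A[:, j]   # O(m) residual patch
            x[j] = new_xj
    return x
```

Keeping `r` synchronised makes each coordinate step cost O(m) rather than the O(mn) of recomputing `A @ x - b` from scratch, which is the kind of residual update the abstract refers to.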
Recommendations
- A coordinate gradient descent method for nonsmooth nonseparable minimization
- A coordinate gradient descent method for nonsmooth separable minimization
- A descent algorithm for nonsmooth convex optimization
- A coordinate descent homotopy method for linearly constrained nonsmooth convex minimization
- An accelerated coordinate gradient descent algorithm for non-separable composite optimization
- Smooth minimization of nonsmooth functions with parallel coordinate descent methods
- A descent nonlinear conjugate gradient method for large-scale unconstrained optimization
Cites work
- scientific article; zbMATH DE number 3511879 (no title available)
- scientific article; zbMATH DE number 6253925 (no title available)
- A New Randomized Block-Coordinate Primal-Dual Proximal Algorithm for Distributed Optimization
- A block coordinate descent method for regularized multiconvex optimization with applications to nonnegative tensor factorization and completion
- A block successive upper-bound minimization method of multipliers for linearly constrained convex optimization
- A coordinate gradient descent method for nonsmooth separable minimization
- A coordinate-descent primal-dual algorithm with large step size and possibly nonseparable functions
- A globally convergent algorithm for nonconvex optimization based on block coordinate update
- A smooth primal-dual optimization framework for nonsmooth composite convex minimization
- A unified convergence analysis of block successive minimization methods for nonsmooth optimization
- Accelerated primal-dual proximal block coordinate updating methods for constrained convex optimization
- Accelerated proximal stochastic dual coordinate ascent for regularized loss minimization
- Accelerated, parallel, and proximal coordinate descent
- An Asynchronous Parallel Stochastic Coordinate Descent Algorithm
- Block-coordinate gradient descent method for linearly constrained nonsmooth separable optimization
- CVXPY: a Python-embedded modeling language for convex optimization
- Coordinate descent algorithms
- Coordinate descent algorithms for nonconvex penalized regression, with applications to biological feature selection
- Distributed coordinate descent method for learning with big data
- Dual coordinate descent methods for logistic regression and maximum entropy models
- Efficiency of coordinate descent methods on huge-scale optimization problems
- Efficient random coordinate descent algorithms for large-scale structured nonconvex optimization
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- LIBLINEAR: a library for large linear classification
- Minimizing Certain Convex Functions
- On optimal probabilities in stochastic coordinate descent methods
- On the complexity analysis of randomized block-coordinate descent methods
- On the complexity analysis of the primal solutions for the accelerated randomized dual coordinate ascent
- On the convergence of block coordinate descent type methods
- On the convergence of the block nonlinear Gauss-Seidel method under convex constraints
- On the convergence of the coordinate descent method for convex differentiable minimization
- On the iteration complexity of cyclic coordinate gradient descent methods
- Parallel coordinate descent methods for big data optimization
- Pathwise coordinate optimization
- Randomized methods for linear constraints: convergence rates and conditioning
- Randomized primal-dual proximal block coordinate updates
- Restarting the accelerated coordinate descent method with a rough strong convexity estimate
- Scikit-learn: machine learning in Python
- Smooth minimization of nonsmooth functions with parallel coordinate descent methods
- SparseNet: coordinate descent with nonconvex penalties
- Stochastic Primal-Dual Hybrid Gradient Algorithm with Arbitrary Sampling and Imaging Applications
- Stochastic dual coordinate ascent methods for regularized loss minimization
- Stochastic quasi-Fejér block-coordinate fixed point iterations with random sweeping
Cited in (2)