A generic coordinate descent solver for non-smooth convex optimisation
From MaRDI portal
Publication:5865339
DOI: 10.1080/10556788.2019.1658758
zbMATH Open: 1494.90077
arXiv: 1812.00628
OpenAlex: W2970475689
Wikidata: Q127312507 (Scholia: Q127312507)
MaRDI QID: Q5865339
FDO: Q5865339
Authors: Olivier Fercoq
Publication date: 13 June 2022
Published in: Optimization Methods & Software
Abstract: We present a generic coordinate descent solver for the minimization of a structured nonsmooth convex objective. The method can in particular handle problems with linear constraints. The implementation uses efficient residual updates and automatically determines which dual variables should be duplicated. A list of basic functional atoms is pre-compiled for efficiency, and a modelling language in Python allows the user to combine them at run time. As a result, the algorithm can solve a large variety of problems, including the Lasso, sparse multinomial logistic regression, and linear and quadratic programs.
Full work available at URL: https://arxiv.org/abs/1812.00628
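The abstract mentions two ingredients that a hedged sketch can illustrate: coordinate-wise proximal updates built from simple functional atoms, and efficient residual updates. The snippet below is a minimal cyclic coordinate descent for the Lasso with an incrementally maintained residual; it is an illustration of the general technique, not the paper's actual implementation, and the function names (`soft_threshold`, `lasso_cd`) are chosen here for exposition.

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of t * |x| (soft-thresholding), a basic atom."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def lasso_cd(A, b, lam, n_iter=200):
    """Cyclic coordinate descent for min_x 0.5*||Ax - b||^2 + lam*||x||_1.

    The residual r = Ax - b is kept up to date incrementally, so each
    coordinate step costs O(m) instead of recomputing Ax from scratch.
    """
    m, n = A.shape
    x = np.zeros(n)
    r = -b.astype(float).copy()       # residual for the initial point x = 0
    col_sq = (A ** 2).sum(axis=0)     # per-coordinate curvature ||A_j||^2
    for _ in range(n_iter):
        for j in range(n):
            if col_sq[j] == 0.0:
                continue
            g = A[:, j] @ r           # partial gradient w.r.t. x_j
            x_new = soft_threshold(x[j] - g / col_sq[j], lam / col_sq[j])
            if x_new != x[j]:
                r += A[:, j] * (x_new - x[j])  # cheap residual update
                x[j] = x_new
    return x
```

With `A` the identity, each coordinate decouples and the solver returns the soft-thresholded data, which makes the update easy to check by hand.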
Recommendations
- A coordinate gradient descent method for nonsmooth nonseparable minimization
- A coordinate gradient descent method for nonsmooth separable minimization
- A descent algorithm for nonsmooth convex optimization
- A coordinate descent homotopy method for linearly constrained nonsmooth convex minimization
- An accelerated coordinate gradient descent algorithm for non-separable composite optimization
- Smooth minimization of nonsmooth functions with parallel coordinate descent methods
- A descent nonlinear conjugate gradient method for large-scale unconstrained optimization
Cites Work
- LIBLINEAR: a library for large linear classification
- CVXPY: a Python-embedded modeling language for convex optimization
- Scikit-learn: machine learning in Python
- Coordinate descent algorithms for nonconvex penalized regression, with applications to biological feature selection
- SparseNet: coordinate descent with nonconvex penalties
- Pathwise coordinate optimization
- A unified convergence analysis of block successive minimization methods for nonsmooth optimization
- A coordinate gradient descent method for nonsmooth separable minimization
- Block-coordinate gradient descent method for linearly constrained nonsmooth separable optimization
- On the convergence of the block nonlinear Gauss-Seidel method under convex constraints
- A block coordinate descent method for regularized multiconvex optimization with applications to nonnegative tensor factorization and completion
- Smooth minimization of nonsmooth functions with parallel coordinate descent methods
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Parallel coordinate descent methods for big data optimization
- Distributed coordinate descent method for learning with big data
- Efficiency of coordinate descent methods on huge-scale optimization problems
- On optimal probabilities in stochastic coordinate descent methods
- Randomized methods for linear constraints: convergence rates and conditioning
- Accelerated, parallel, and proximal coordinate descent
- An Asynchronous Parallel Stochastic Coordinate Descent Algorithm
- Stochastic quasi-Fejér block-coordinate fixed point iterations with random sweeping
- Coordinate descent algorithms
- Stochastic dual coordinate ascent methods for regularized loss minimization
- On the convergence of block coordinate descent type methods
- A block successive upper-bound minimization method of multipliers for linearly constrained convex optimization
- Efficient random coordinate descent algorithms for large-scale structured nonconvex optimization
- On the complexity analysis of randomized block-coordinate descent methods
- Accelerated proximal stochastic dual coordinate ascent for regularized loss minimization
- On the convergence of the coordinate descent method for convex differentiable minimization
- Dual coordinate descent methods for logistic regression and maximum entropy models
- Minimizing Certain Convex Functions
- A globally convergent algorithm for nonconvex optimization based on block coordinate update
- Stochastic Primal-Dual Hybrid Gradient Algorithm with Arbitrary Sampling and Imaging Applications
- Randomized primal-dual proximal block coordinate updates
- A coordinate-descent primal-dual algorithm with large step size and possibly nonseparable functions
- A New Randomized Block-Coordinate Primal-Dual Proximal Algorithm for Distributed Optimization
- On the iteration complexity of cyclic coordinate gradient descent methods
- A smooth primal-dual optimization framework for nonsmooth composite convex minimization
- Accelerated primal-dual proximal block coordinate updating methods for constrained convex optimization
- Restarting the accelerated coordinate descent method with a rough strong convexity estimate
- On the complexity analysis of the primal solutions for the accelerated randomized dual coordinate ascent
Cited In (2)