A flexible coordinate descent method
DOI: 10.1007/s10589-018-9984-3 · zbMATH Open: 1391.90410 · arXiv: 1507.03713 · OpenAlex: W2963332075 · MaRDI QID: Q1639710
Authors: Kimon Fountoulakis, Rachael Tappenden
Publication date: 13 June 2018
Published in: Computational Optimization and Applications
Full work available at URL: https://arxiv.org/abs/1507.03713
Recommendations
- Random Coordinate Descent Methods for Nonseparable Composite Optimization
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Efficiency of coordinate descent methods on huge-scale optimization problems
- A coordinate-descent primal-dual algorithm with large step size and possibly nonseparable functions
- On the complexity analysis of randomized block-coordinate descent methods
Keywords: large scale optimization; iteration complexity; nonsmooth problems; second-order methods; block coordinate descent; randomized; curvature information
MSC: Numerical mathematical programming methods (65K05); Convex programming (90C25); Large-scale problems in mathematical programming (90C06); Numerical methods based on nonlinear programming (49M37); Methods of quasi-Newton type (90C53); Newton-type methods (49M15)
Cites Work
- IMRO: A proximal quasi-Newton method for solving \(\ell_1\)-regularized least squares problems
- Numerical Optimization
- Title not available
- Title not available
- Introductory lectures on convex optimization. A basic course.
- Compressive sampling
- A coordinate gradient descent method for nonsmooth separable minimization
- Standardization and the group lasso penalty
- Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information
- Compressed sensing
- Block-coordinate gradient descent method for linearly constrained nonsmooth separable optimization
- First-order methods of smooth convex optimization with inexact oracle
- Performance of first- and second-order methods for \(\ell_1\)-regularized least squares problems
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- A second-order method for strongly convex \(\ell _1\)-regularization problems
- Parallel coordinate descent methods for big data optimization
- Accelerated block-coordinate relaxation for regularized optimization
- Efficiency of coordinate descent methods on huge-scale optimization problems
- Inexact coordinate descent: complexity and preconditioning
- Accelerated, parallel, and proximal coordinate descent
- Parallel Selective Algorithms for Nonconvex Big Data Optimization
- Hybrid Random/Deterministic Parallel Algorithms for Convex and Nonconvex Big Data Optimization
- A random coordinate descent algorithm for optimization problems with composite objective function and linear coupled constraints
- On the complexity of parallel coordinate descent
- A coordinate gradient descent method for linearly constrained smooth optimization and support vector machines training
- Stochastic dual coordinate ascent methods for regularized loss minimization
- On the convergence of inexact block coordinate descent methods for constrained optimization
- Efficient block-coordinate descent algorithms for the group Lasso
- An inexact successive quadratic approximation method for L-1 regularized optimization
- Practical inexact proximal quasi-Newton method with global complexity analysis
- On the complexity analysis of randomized block-coordinate descent methods
- Proximal Newton-type methods for minimizing composite functions
- Nonsmooth mechanics and analysis. Theoretical and numerical advances
- A fast active set block coordinate descent algorithm for \(\ell_1\)-regularized least squares
Cited In (18)
- Inexact variable metric stochastic block-coordinate descent for regularized optimization
- Inexact coordinate descent: complexity and preconditioning
- A coordinate descent method for total variation minimization
- Title not available
- Accelerating block coordinate descent methods with identification strategies
- Title not available
- A fast active set block coordinate descent algorithm for \(\ell_1\)-regularized least squares
- Coordinate descent with arbitrary sampling. I: Algorithms and complexity.
- A coordinate-descent primal-dual algorithm with large step size and possibly nonseparable functions
- Linear convergence of randomized feasible descent methods under the weak strong convexity assumption
- Semi-stochastic coordinate descent
- Second order semi-smooth proximal Newton methods in Hilbert spaces
- Globalized inexact proximal Newton-type methods for nonconvex composite functions
- Fast and safe: accelerated gradient methods with optimality certificates and underestimate sequences
- Convergence analysis of inexact randomized iterative methods
- Random Coordinate Descent Methods for Nonseparable Composite Optimization
- A randomized coordinate descent method with volume sampling
- Distributed block coordinate descent for minimizing partially separable functions