A random coordinate descent algorithm for optimization problems with composite objective function and linear coupled constraints
From MaRDI portal
Publication: 461437
DOI: 10.1007/s10589-013-9598-8
zbMath: 1304.90160
arXiv: 1302.3074
OpenAlex: W1999593606
MaRDI QID: Q461437
Ion Necoara, Andrei T. Patrascu
Publication date: 10 October 2014
Published in: Computational Optimization and Applications
Full work available at URL: https://arxiv.org/abs/1302.3074
Keywords: convergence rate; randomized algorithms; coordinate descent; \(\mathcal{O}(1/\epsilon)\); composite objective function; linearly coupled constraints
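The keywords describe randomized coordinate descent for composite objectives under linearly coupled constraints. The core idea of such methods is that updating a random *pair* of coordinates along the direction \(e_i - e_j\) leaves a single linear coupling constraint (e.g. \(\sum_i x_i = b\)) satisfied. The following is a minimal illustrative sketch of that idea only, not the paper's algorithm; the objective \(f(x) = \tfrac{1}{2}\|x - c\|^2\) and all names are chosen here for concreteness.

```python
import numpy as np

def random_pair_cd(c, b, iters=20000, seed=0):
    """Illustrative 2-coordinate descent for min 0.5*||x - c||^2 s.t. sum(x) = b.

    Each step picks a random pair (i, j) and moves along e_i - e_j, which
    preserves the coupling constraint exactly. This is a toy sketch of the
    pairwise-update idea, not the method analyzed in the paper.
    """
    rng = np.random.default_rng(seed)
    n = len(c)
    x = np.full(n, b / n)  # feasible start: sum(x) = b
    for _ in range(iters):
        i, j = rng.choice(n, size=2, replace=False)
        g = x - c  # gradient of the smooth objective
        # Exact minimizer along e_i - e_j (coordinate Lipschitz constants = 1):
        t = -(g[i] - g[j]) / 2.0
        x[i] += t
        x[j] -= t
    return x

c = np.array([3.0, -1.0, 2.0, 0.5])
b = 1.0
x = random_pair_cd(c, b)
# The minimizer is the projection of c onto {sum(x) = b}: c + (b - c.sum())/n
```

Each pairwise step keeps the iterate feasible without any projection, which is what makes such updates attractive for linearly coupled problems.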
Related Items (29)
- Block-coordinate and incremental aggregated proximal gradient methods for nonsmooth nonconvex problems
- Selective bi-coordinate variations for resource allocation type problems
- An Accelerated Randomized Proximal Coordinate Gradient Method and its Application to Regularized Empirical Risk Minimization
- Inexact coordinate descent: complexity and preconditioning
- A flexible coordinate descent method
- Parallel Random Coordinate Descent Method for Composite Minimization: Convergence Analysis and Error Bounds
- Active-Set Identification with Complexity Guarantees of an Almost Cyclic 2-Coordinate Descent Method with Armijo Line Search
- Random block coordinate descent methods for linearly constrained optimization over networks
- Separable approximations and decomposition methods for the augmented Lagrangian
- New method for solving Ivanov regularization-based support vector machine learning
- An almost cyclic 2-coordinate descent method for singly linearly constrained problems
- Asynchronous Stochastic Coordinate Descent: Parallelism and Convergence Properties
- Local linear convergence of proximal coordinate descent algorithm
- Hybrid Jacobian and Gauss--Seidel Proximal Block Coordinate Update Methods for Linearly Constrained Convex Programming
- Faster convergence of a randomized coordinate descent method for linearly constrained optimization problems
- Blocks of coordinates, stochastic programming, and markets
- Efficient random coordinate descent algorithms for large-scale structured nonconvex optimization
- A Coordinate-Descent Primal-Dual Algorithm with Large Step Size and Possibly Nonseparable Functions
- Fejér-monotone hybrid steepest descent method for affinely constrained and composite convex minimization tasks
- Multi-label Lagrangian support vector machine with random block coordinate descent method
- Unnamed Item
- A Block Successive Upper-Bound Minimization Method of Multipliers for Linearly Constrained Convex Optimization
- On the generally randomized extended Gauss-Seidel method
- Coordinate descent with arbitrary sampling I: algorithms and complexity
- Coordinate descent with arbitrary sampling II: expected separable overapproximation
- Stochastic Quasi-Fejér Block-Coordinate Fixed Point Iterations with Random Sweeping
- Unnamed Item
- Proximal Gradient Methods with Adaptive Subspace Sampling
- Parallel coordinate descent methods for big data optimization
Uses Software
Cites Work
- Parallel coordinate descent methods for big data optimization
- Inexact coordinate descent: complexity and preconditioning
- Approximate level method for nonsmooth convex minimization
- A coordinate gradient descent method for linearly constrained smooth optimization and support vector machines training
- Optimal scaling of a gradient method for distributed resource allocation
- On the solution of the symmetric eigenvalue complementarity problem by the spectral projected gradient algorithm
- A coordinate gradient descent method for nonsmooth separable minimization
- Block-coordinate gradient descent method for linearly constrained nonsmooth separable optimization
- Decomposition algorithm model for singly linearly-constrained problems subject to lower and upper bounds
- Introductory lectures on convex optimization. A basic course.
- Solution methodologies for the smallest enclosing circle problem
- Efficient block-coordinate descent algorithms for the group Lasso
- Random block coordinate descent methods for linearly constrained optimization over networks
- On linear-time algorithms for the continuous quadratic Knapsack problem
- New algorithms for singly linearly constrained quadratic programs subject to lower and upper bounds
- Atomic Decomposition by Basis Pursuit
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- Primal-Dual Subgradient Method for Huge-Scale Linear Conic Problems
- Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information
- Interior-Point Methods for Massive Support Vector Machines
- Efficient Serial and Parallel Coordinate Descent Methods for Huge-Scale Truss Topology Design
- On the Nonasymptotic Convergence of Cyclic Coordinate Descent Methods
- On the Convergence of Block Coordinate Descent Type Methods
- Model Selection and Estimation in Regression with Grouped Variables
- Learning Theory