A parallel line search subspace correction method for composite convex optimization
From MaRDI portal
Publication:2516372
DOI: 10.1007/s40305-015-0079-x
zbMath: 1317.90234
OpenAlex: W757760243
MaRDI QID: Q2516372
Xin Liu, Qian Dong, Ya-Xiang Yuan, Zai-Wen Wen
Publication date: 31 July 2015
Published in: Journal of the Operations Research Society of China
Full work available at URL: https://doi.org/10.1007/s40305-015-0079-x
Keywords: domain decomposition; block coordinate descent method; line search; distributed optimization; Jacobian-type iteration
Cites Work
- Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Parallel coordinate descent methods for big data optimization
- On the linear convergence of a proximal gradient method for a class of nonsmooth convex minimization problems
- On the complexity analysis of randomized block-coordinate descent methods
- Iteration complexity analysis of block coordinate descent methods
- A convergent overlapping domain decomposition method for total variation minimization
- On the sublinear convergence rate of multi-block ADMM
- A coordinate gradient descent method for nonsmooth separable minimization
- A dual algorithm for the solution of nonlinear variational problems via finite element approximation
- Error bounds and convergence analysis of feasible descent methods: A general approach
- On the convergence of the coordinate descent method for convex differentiable minimization
- Parallel multi-block ADMM with \(o(1/k)\) convergence
- Bregmanized domain decomposition for image restoration
- Smooth minimization of nonsmooth functions with parallel coordinate descent methods
- On the proximal Jacobian decomposition of ALM for multiple-block separable convex minimization problems and its relationship to ADMM
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Optimization theory and methods. Nonlinear programming
- Global and uniform convergence of subspace correction methods for some convex optimization problems
- On the convergence of an active-set method for ℓ1 minimization
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- Subspace Correction Methods for Total Variation and $\ell_1$-Minimization
- Fixed-Point Continuation for $\ell_1$-Minimization: Methodology and Convergence
- Accelerated, Parallel, and Proximal Coordinate Descent
- An Accelerated Randomized Proximal Coordinate Gradient Method and its Application to Regularized Empirical Risk Minimization
- Solving Multiple-Block Separable Convex Minimization Problems Using Two-Block Alternating Direction Method of Multipliers
- Two-Point Step Size Gradient Methods
- On the Linear Convergence of Descent Methods for Convex Essentially Smooth Minimization
- Wavelet Decomposition Method for $L_2/$TV-Image Deblurring
- On the Convergence of Block Coordinate Descent Type Methods
- Domain decomposition methods for linear inverse problems with sparsity constraints
- A Schur complement based semi-proximal ADMM for convex quadratic conic programming and extensions