A coordinate gradient descent method for \(\ell_1\)-regularized convex minimization
DOI: 10.1007/s10589-009-9251-8
zbMATH Open: 1220.90092
OpenAlex: W1982941867
MaRDI QID: Q535291
Authors: Sangwoon Yun, Kim-Chuan Toh
Publication date: 11 May 2011
Published in: Computational Optimization and Applications
Full work available at URL: https://doi.org/10.1007/s10589-009-9251-8
Recommendations
- A coordinate gradient descent method for nonsmooth separable minimization
- A fast active set block coordinate descent algorithm for \(\ell_1\)-regularized least squares
- Coordinate descent optimization for \(l^{1}\) minimization with application to compressed sensing; a greedy algorithm
- A block coordinate gradient descent method for regularized convex separable optimization and covariance selection
- On the iteration complexity of cyclic coordinate gradient descent methods
Keywords: logistic regression; convex optimization; compressed sensing; image deconvolution; linear least squares; \(\ell_{1}\)-regularization; coordinate gradient descent; Q-linear convergence
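The paper's own algorithm (a coordinate gradient descent with line search and Q-linear convergence guarantees) is not reproduced here. As a minimal, illustrative sketch of the problem class named in the keywords, the following implements plain cyclic coordinate descent for the \(\ell_1\)-regularized least-squares (lasso) problem, where each single-coordinate subproblem has a closed-form soft-thresholding solution. The function names and the cyclic update order are assumptions for illustration, not the authors' method.

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator, the proximal map of t * |.|."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def coordinate_descent_lasso(A, b, lam, n_iter=100):
    """Cyclic coordinate descent for min_x 0.5*||Ax - b||^2 + lam*||x||_1.

    Minimizing over one coordinate with the others fixed gives the
    closed-form update x_j = S(a_j^T r + ||a_j||^2 x_j, lam) / ||a_j||^2,
    where r = b - Ax is the residual and S is soft-thresholding.
    """
    m, n = A.shape
    x = np.zeros(n)
    col_sq = (A ** 2).sum(axis=0)  # per-coordinate curvature ||a_j||^2
    r = b - A @ x                  # residual, maintained incrementally
    for _ in range(n_iter):
        for j in range(n):
            if col_sq[j] == 0.0:
                continue
            rho = A[:, j] @ r + col_sq[j] * x[j]
            x_new = soft_threshold(rho, lam) / col_sq[j]
            r += A[:, j] * (x[j] - x_new)  # update residual incrementally
            x[j] = x_new
    return x
```

For orthonormal columns (e.g. `A = np.eye(3)`) a single sweep already recovers the exact solution `soft_threshold(b, lam)`; in general the sweeps converge to a lasso minimizer.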
Cites Work
- Least angle regression. (With discussion)
- Near-Optimal Signal Recovery From Random Projections: Universal Encoding Strategies?
- Atomic Decomposition by Basis Pursuit
- Just relax: convex programming methods for identifying sparse signals in noise
- A new approach to variable selection in least squares problems
- A coordinate gradient descent method for nonsmooth separable minimization
- Trust region Newton method for logistic regression
- Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information
- Compressed sensing
- Sparse Reconstruction by Separable Approximation
- An EM algorithm for wavelet-based image restoration
- On Sparse Representations in Arbitrary Redundant Bases
- An iterative thresholding algorithm for linear inverse problems with a sparsity constraint
- Breakdown of equivalence between the minimal \(\ell^1\)-norm solution and the sparsest solution
- Error bounds and convergence analysis of feasible descent methods: A general approach
- An interior-point method for large-scale \(l_1\)-regularized logistic regression
- LASSO-pattern search algorithm with application to ophthalmology and genomic data
- Mathematical Programming for Data Mining: Formulations and Challenges
- Fast Solution of $\ell _{1}$-Norm Minimization Problems When the Solution May Be Sparse
- Wavelets and curvelets for image deconvolution: a combined approach
- Large scale kernel regression via linear programming
Cited In (43)
- Stochastic block-coordinate gradient projection algorithms for submodular maximization
- Nonmonotone adaptive Barzilai-Borwein gradient algorithm for compressed sensing
- A coordinate descent method for total variation minimization
- Iterative reweighted minimization methods for \(l_p\) regularized unconstrained nonlinear programming
- Two approaches for solving \(l_1\)-regularized least squares with application to truss topology design
- A multilevel framework for sparse optimization with application to inverse covariance estimation and logistic regression
- A fast hybrid algorithm for large-scale \(l_{1}\)-regularized logistic regression
- A family of inexact SQA methods for non-smooth convex minimization with provable convergence guarantees based on the Luo-Tseng error bound property
- Generalized conjugate gradient methods for \(\ell_1\) regularized convex quadratic programming with finite convergence
- A fast active set block coordinate descent algorithm for \(\ell_1\)-regularized least squares
- A distributed block coordinate descent method for training \(l_1\) regularized linear classifiers
- A new spectral method for \(l_1\)-regularized minimization
- A coordinate descent homotopy method for linearly constrained nonsmooth convex minimization
- Survey of solving the optimization problems for sparse learning
- Coordinate and subspace optimization methods for linear least squares with non-quadratic regularization
- A smoothing SQP framework for a class of composite \(L_q\) minimization over polyhedron
- Primal and dual alternating direction algorithms for \(\ell _{1}\)-\(\ell _{1}\)-norm minimization problems in compressive sensing
- Another hybrid approach for solving monotone operator equations and application to signal processing
- An inexact coordinate descent method for the weighted \(l_{1}\)-regularized convex optimization problem
- On the complexity analysis of randomized block-coordinate descent methods
- A randomized nonmonotone block proximal gradient method for a class of structured nonlinear programming
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Block coordinate descent algorithms for large-scale sparse multiclass classification
- Nonmonotone Barzilai-Borwein gradient algorithm for \(\ell_1\)-regularized nonsmooth minimization in compressive sensing
- Screening for a reweighted penalized conditional gradient method
- Non-smooth equations based method for \(\ell_1\)-norm problems with applications to compressed sensing
- Gradient iteration with \(\ell _{p}\)-norm constraints
- Primal-dual first-order methods for a class of cone programming
- Orthogonal rank-one matrix pursuit for low rank matrix completion
- Differential network inference via the fused D-trace loss with cross variables
- A block coordinate gradient descent method for regularized convex separable optimization and covariance selection
- Large sparse signal recovery by conjugate gradient algorithm based on smoothing technique
- New nonsmooth equations-based algorithms for \(\ell_1\)-norm minimization and applications
- Combining line search and trust-region methods for \(\ell_1\)-minimization
- A simple and feasible method for a class of large-scale \(l^1\)-problems
- On the convergence of inexact block coordinate descent methods for constrained optimization
- Incomplete variables truncated conjugate gradient method for signal reconstruction in compressed sensing
- Graphical Lasso and thresholding: equivalence and closed-form solutions
- A gradient descent algorithm for LASSO
- Minimization of \(\ell_{1-2}\) for compressed sensing
- A reduced-space algorithm for minimizing \(\ell_1\)-regularized convex functions
- A coordinate gradient descent method for nonsmooth separable minimization
- Title not available
This page was built for publication: A coordinate gradient descent method for \(\ell_{1}\)-regularized convex minimization