Coordinate descent optimization for \(l^{1}\) minimization with application to compressed sensing; a greedy algorithm
Publication: 2268312
DOI: 10.3934/ipi.2009.3.487
zbMath: 1188.90196
OpenAlex: W2787198218
MaRDI QID: Q2268312
Ying-Ying Li, Stanley J. Osher
Publication date: 10 March 2010
Published in: Inverse Problems and Imaging
Full work available at URL: https://doi.org/10.3934/ipi.2009.3.487
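The publication concerns coordinate descent for \(l^{1}\) minimization in compressed sensing. Purely as an illustration of the general technique, and not of the authors' specific greedy coordinate-selection rule, the following minimal sketch applies plain cyclic coordinate descent with soft-thresholding to the lasso objective \(\tfrac12\|Ax-b\|_2^2 + \lambda\|x\|_1\); all function names, parameters, and the synthetic data are assumptions made for this example.

import numpy as np

def soft_threshold(z, t):
    # Soft-thresholding operator: exact minimizer of the 1-D l1 subproblem.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def coordinate_descent_lasso(A, b, lam, n_iter=100):
    # Cyclic coordinate descent for (1/2)||A x - b||^2 + lam * ||x||_1.
    m, n = A.shape
    x = np.zeros(n)
    r = b - A @ x                    # residual, updated coordinate by coordinate
    col_sq = np.sum(A * A, axis=0)   # ||A_j||^2 for each column
    for _ in range(n_iter):
        for j in range(n):
            if col_sq[j] == 0.0:
                continue
            # Correlation of column j with the partial residual that excludes x_j.
            rho = A[:, j] @ r + col_sq[j] * x[j]
            x_new = soft_threshold(rho, lam) / col_sq[j]
            r += A[:, j] * (x[j] - x_new)   # keep the residual consistent
            x[j] = x_new
    return x

# Illustrative use on a small synthetic compressed-sensing instance (assumed sizes).
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((40, 100))
    x_true = np.zeros(100)
    x_true[[3, 17, 58]] = [1.5, -2.0, 0.7]
    b = A @ x_true
    x_hat = coordinate_descent_lasso(A, b, lam=0.1, n_iter=200)
    print("nonzeros recovered:", np.flatnonzero(np.abs(x_hat) > 1e-3))

Each one-dimensional subproblem has a closed-form minimizer given by soft-thresholding, which is what makes coordinate-wise updates attractive for this class of objectives.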
Related Items (23)
An Accelerated Randomized Proximal Coordinate Gradient Method and its Application to Regularized Empirical Risk Minimization
A variational method for Abel inversion tomography with mixed Poisson-Laplace-Gaussian noise
An alternating maximization method for approximating the hump of the matrix exponential
Separable approximations and decomposition methods for the augmented Lagrangian
Algorithm for overcoming the curse of dimensionality for time-dependent non-convex Hamilton-Jacobi equations arising from optimal control and differential games problems
A Randomized Nonmonotone Block Proximal Gradient Method for a Class of Structured Nonlinear Programming
A unified primal-dual algorithm framework based on Bregman iteration
Heat source identification based on \(\ell_1\) constrained minimization
Efficient LED-SAC sparse estimator using fast sequential adaptive coordinate-wise optimization (LED-2SAC)
Stochastic block-coordinate gradient projection algorithms for submodular maximization
Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
Nonnegative control of finite-dimensional linear systems
On the complexity analysis of randomized block-coordinate descent methods
On the complexity of parallel coordinate descent
A coordinate descent method for total variation minimization
A coordinate descent based method for geometry optimization of trusses
Block Coordinate Descent Methods for Semidefinite Programming
Markov chain block coordinate descent
Sparse group fused Lasso for model segmentation: a hybrid approach
An Adaptive Fourier Filter for Relaxing Time Stepping Constraints for Explicit Solvers
A Network of Spiking Neurons for Computing Sparse Representations in an Energy-Efficient Way
Learning and meta-learning of stochastic advection–diffusion–reaction systems from sparse measurements
Parallel coordinate descent methods for big data optimization