An iteratively reweighted least squares algorithm for sparse regularization
From MaRDI portal
Publication: 4635396
DOI: 10.1090/CONM/693/13941 · zbMATH Open: 1392.65074 · arXiv: 1511.08970 · OpenAlex: W2263124147 · MaRDI QID: Q4635396 · FDO: Q4635396
Ingrid Daubechies, S. M. Voronin
Publication date: 16 April 2018
Published in: Functional Analysis, Harmonic Analysis, and Image Processing
Abstract: We present a new algorithm and the corresponding convergence analysis for the regularization of linear inverse problems with sparsity constraints, applied to a new generalized sparsity promoting functional. The algorithm is based on the idea of iteratively reweighted least squares, reducing the minimization at every iteration step to that of a functional including only $\ell_2$-norms. This amounts to a smoothing of the absolute value function that appears in the generalized sparsity promoting penalty we consider, with the smoothing becoming iteratively less pronounced. We demonstrate that the sequence of iterates of our algorithm converges to a limit that minimizes the original functional.
Full work available at URL: https://arxiv.org/abs/1511.08970
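The abstract describes the iteratively reweighted least squares (IRLS) idea in general terms. The sketch below is a minimal, illustrative Python implementation of that idea for a plain $\ell_1$ penalty, not the generalized functional or the specific update rule analyzed in the paper; the function name `irls_l1`, the geometric decrease of the smoothing parameter `eps`, and all parameter values are assumptions made purely for illustration.

```python
import numpy as np

def irls_l1(A, y, lam=0.1, n_iter=50, eps0=1.0):
    """Minimal IRLS sketch for min_x 0.5*||Ax - y||^2 + lam*||x||_1.

    The absolute value |x_i| is smoothed as sqrt(x_i^2 + eps^2), so each
    iteration minimizes a weighted l2 (least-squares) surrogate; eps is
    decreased so the smoothing becomes iteratively less pronounced.
    """
    m, n = A.shape
    x = np.zeros(n)
    eps = eps0
    for _ in range(n_iter):
        # Weights from the current iterate and smoothing level
        w = 1.0 / np.sqrt(x**2 + eps**2)
        # Each step solves a linear system: (A^T A + lam * diag(w)) x = A^T y
        x = np.linalg.solve(A.T @ A + lam * np.diag(w), A.T @ y)
        # Reduce the smoothing parameter (one common heuristic; not the
        # update rule analyzed in the paper)
        eps = max(eps * 0.7, 1e-8)
    return x

# Tiny usage example on a random sparse-recovery instance
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[[3, 17, 62]] = [1.5, -2.0, 0.8]
y = A @ x_true
x_hat = irls_l1(A, y, lam=1e-3)
print(np.round(x_hat[[3, 17, 62]], 2))
```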
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Linearized Bregman iterations for compressed sensing
- Alternating Direction Algorithms for $\ell_1$-Problems in Compressive Sensing
- An iterative thresholding algorithm for linear inverse problems with a sparsity constraint
- Bregman Iterative Algorithms for $\ell_1$-Minimization with Applications to Compressed Sensing
- Iteratively reweighted least squares minimization for sparse recovery
- Conjugate gradient acceleration of iteratively re-weighted least squares methods
Cited In (9)
- LSMR: An Iterative Algorithm for Sparse Least-Squares Problems
- Wavelet estimation of the dimensionality of curve time series
- Conjugate gradient acceleration of iteratively re-weighted least squares methods
- Sparsity-enforcing regularisation and ISTA revisited
- Sparse matrix transform based weight updating in partial least squares regression
- Conjugate gradient based acceleration for inverse problems
- Title not available
- Enhancing Sparsity and Resolution via Reweighted Atomic Norm Minimization
- Proximal methods for reweighted \(l_Q\)-regularization of sparse signal recovery
Recommendations
- Title not available
- LSMR: An Iterative Algorithm for Sparse Least-Squares Problems
- Iteratively Reweighted Least Squares: Algorithms, Convergence Analysis, and Numerical Comparisons
- Iteratively reweighted least squares minimization for sparse recovery
- Iterative regularization with minimum-residual methods
- An iterative algorithm with adaptive weights and sparse Laplacian shrinkage for regression problems
- Smoothed Low Rank and Sparse Matrix Recovery by Iteratively Reweighted Least Squares Minimization
- An iterative algorithm for large size least-squares constrained regularization problems
- Sparse signal recovery with prior information by iterative reweighted least squares algorithm
- New regularization method and iteratively reweighted algorithm for sparse vector recovery
This page was built for publication: An iteratively reweighted least squares algorithm for sparse regularization