Minimization of transformed \(L_1\) penalty: theory, difference of convex function algorithm, and robust application in compressed sensing
From MaRDI portal
Publication:1749455
DOI: 10.1007/s10107-018-1236-x | zbMath: 1386.94049 | arXiv: 1411.5735 | OpenAlex: W2962689221 | MaRDI QID: Q1749455
Publication date: 16 May 2018
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://arxiv.org/abs/1411.5735
Keywords: convergence analysis; compressed sensing; robust recovery; coherent random matrices; difference of convex function algorithm; sparse signal recovery theory; transformed \(l_1\) penalty
Applications of mathematical programming (90C90) Nonconvex programming, global optimization (90C26) Numerical optimization and variational techniques (65K10) Signal theory (characterization, reconstruction, filtering, etc.) (94A12)
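The transformed \(l_1\) (TL1) penalty named in the keywords is the parametrized function \(\rho_a(t) = (a+1)|t|/(a+|t|)\), which interpolates between the \(l_0\) and \(l_1\) penalties as the parameter \(a\) varies. A minimal numerical sketch of this behavior (the function name and sample values are illustrative, not taken from this record):

```python
# Transformed L1 (TL1) penalty: rho_a(t) = (a + 1)|t| / (a + |t|), a > 0.
# As a -> infinity it approaches |t| (the L1 penalty); as a -> 0+ it
# approaches the L0 indicator (1 for t != 0, 0 for t == 0).
def tl1(t, a):
    return (a + 1.0) * abs(t) / (a + abs(t))

print(tl1(0.0, 1.0))   # exactly 0 at the origin, for any a > 0
print(tl1(2.0, 1e6))   # close to |t| = 2 for large a (L1-like)
print(tl1(2.0, 1e-6))  # close to 1 for small a (L0-like)
```

The penalty is even in \(t\) and vanishes only at the origin, which is why it can serve as a smooth-parameter bridge between convex \(l_1\) relaxation and the combinatorial \(l_0\) count.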
Related Items
- A General Framework of Rotational Sparse Approximation in Uncertainty Quantification
- A truncated Newton algorithm for nonconvex sparse recovery
- Minimization of $L_1$ Over $L_2$ for Sparse Signal Recovery with Convergence Guarantee
- An interior stochastic gradient method for a class of non-Lipschitz optimization problems
- A unifying framework of high-dimensional sparse estimation with difference-of-convex (DC) regularizations
- Minimizing $L_1$ over $L_2$ norms on the gradient
- The springback penalty for robust signal recovery
- A novel regularization based on the error function for sparse recovery
- Generalized sparse recovery model and its neural dynamical optimization method for compressed sensing
- Stable Image Reconstruction Using Transformed Total Variation Minimization
- Transformed \(\ell_1\) regularization for learning sparse deep neural networks
- Difference-of-Convex Learning: Directional Stationarity, Optimality, and Sparsity
- Block sparse signal recovery via minimizing the block \(q\)-ratio sparsity
- Transformed Schatten-1 penalty based full-rank latent label learning for incomplete multi-label classification
- A nonconvex nonsmooth image prior based on the hyperbolic tangent function
- Accelerated sparse recovery via gradient descent with nonlinear conjugate gradient momentum
- A primal dual active set with continuation algorithm for high-dimensional nonconvex SICA-penalized regression
- Sorted \(L_1/L_2\) minimization for sparse signal recovery
- A wonderful triangle in compressed sensing
- Distributed nonconvex constrained optimization over time-varying digraphs
- \(L_1-\beta L_q\) Minimization for Signal and Image Recovery
- Enhanced total variation minimization for stable image reconstruction
- Fast L1-L2 minimization via a proximal operator
- A proximal difference-of-convex algorithm with extrapolation
- Iteratively Reweighted Group Lasso Based on Log-Composite Regularization
- Consistency bounds and support recovery of d-stationary solutions of sparse sample average approximations
- A Scale-Invariant Approach for Sparse Signal Recovery
- The modified second APG method for DC optimization problems
- Limited-Angle CT Reconstruction via the $L_1/L_2$ Minimization
- A Weighted Difference of Anisotropic and Isotropic Total Variation for Relaxed Mumford--Shah Color and Multiphase Image Segmentation
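For context on the "difference of convex function algorithm" in the title: the TL1 penalty \(\rho_a(t) = (a+1)|t|/(a+|t|)\) admits a difference-of-convex splitting, and DCA-type methods linearize the concave part at each iterate. One algebraically valid decomposition (notation is ours, not quoted from this record; both terms below are convex, and their difference simplifies back to \(\rho_a\)):

\[
\rho_a(t)
= \underbrace{\frac{a+1}{a}\,|t|}_{g(t),\ \text{convex}}
\;-\;
\underbrace{\frac{(a+1)\,t^2}{a\,(a+|t|)}}_{h(t),\ \text{convex}},
\qquad
g(t)-h(t)
= \frac{a+1}{a}\cdot\frac{|t|(a+|t|)-t^2}{a+|t|}
= \frac{(a+1)\,|t|}{a+|t|}.
\]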
Uses Software
Cites Work
- Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers
- Nearly unbiased variable selection under minimax concave penalty
- A unified approach to model selection and sparse recovery using regularized least squares
- DC approximation approaches for sparse optimization
- Point source super-resolution via non-convex \(L_1\) based methods
- Super-resolution from noisy data
- Computing sparse representation in a highly coherent dictionary based on difference of \(L_1\) and \(L_2\)
- Convex analysis approach to d. c. programming: Theory, algorithms and applications
- Minimization of transformed \(l_1\) penalty: closed form representation and iterative thresholding algorithms
- Transformed Schatten-1 iterative thresholding algorithms for low rank matrix completion
- DC Approximation Approach for ℓ0-minimization in Compressed Sensing
- Improved Iteratively Reweighted Least Squares for Unconstrained Smoothed $\ell_q$ Minimization
- Learning sparse classifiers with difference of convex functions algorithms
- A Method for Finding Structured Sparse Solutions to Nonnegative Least Squares Problems with Applications
- Coherence Pattern–Guided Compressive Sensing with Unresolved Grids
- Compressed sensing and best \(k\)-term approximation
- Alternating Direction Algorithms for $\ell_1$-Problems in Compressive Sensing
- SparseNet: Coordinate Descent With Nonconvex Penalties
- The Split Bregman Method for L1-Regularized Problems
- A Continuous Exact $\ell_0$ Penalty (CEL0) for Least Squares Regularized Problem
- DC Programming and DCA for General DC Programs
- Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information
- Decoding by Linear Programming
- Signal Recovery From Random Measurements Via Orthogonal Matching Pursuit
- A D.C. Optimization Algorithm for Solving the Trust-Region Subproblem
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Local Strong Homogeneity of a Regularized Estimator
- $L_{1/2}$ Regularization: Convergence of Iterative Half Thresholding Algorithm
- Iterative $l_1$ Minimization for Non-Convex Compressed Sensing
- Sparse Approximate Solutions to Linear Systems
- Matching pursuits with time-frequency dictionaries
- Minimization of $\ell_{1-2}$ for Compressed Sensing
- Difference-of-Convex Learning: Directional Stationarity, Optimality, and Sparsity
- Sparse Approximation via Penalty Decomposition Methods
- Bregman Iterative Algorithms for $\ell_1$-Minimization with Applications to Compressed Sensing
- Optimally sparse representation in general (nonorthogonal) dictionaries via ℓ 1 minimization
- Stable signal recovery from incomplete and inaccurate measurements
- Compressed sensing