Minimization of transformed \(L_1\) penalty: theory, difference of convex function algorithm, and robust application in compressed sensing

Publication: 1749455


DOI: 10.1007/s10107-018-1236-x
zbMath: 1386.94049
arXiv: 1411.5735
OpenAlex: W2962689221
MaRDI QID: Q1749455

Shuai Zhang, Jack X. Xin

Publication date: 16 May 2018

Published in: Mathematical Programming. Series A. Series B

Full work available at URL: https://arxiv.org/abs/1411.5735
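For context on the subject of this record, the transformed \(L_1\) (TL1) penalty named in the title is commonly parametrized by \(a > 0\); the following is a sketch of the standard single-variable form together with one possible difference-of-convex splitting, recalled from the TL1 literature rather than quoted from this record, and not necessarily the exact splitting used in the paper:
\[
\rho_a(t) = \frac{(a+1)\,|t|}{a + |t|}, \qquad a > 0,
\]
which interpolates between the \(\ell_0\) and \(\ell_1\) penalties, since \(\rho_a(t) \to \mathbf{1}_{\{t \neq 0\}}\) as \(a \to 0^{+}\) and \(\rho_a(t) \to |t|\) as \(a \to \infty\). One difference-of-convex decomposition is
\[
\rho_a(t) = \underbrace{\frac{a+1}{a}\,|t|}_{\text{convex}} \;-\; \underbrace{\frac{(a+1)\,t^2}{a\,(a + |t|)}}_{\text{convex}},
\]
which is the kind of splitting that makes a DCA-type (difference of convex function algorithm) iteration applicable to TL1-regularized problems.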



Related Items

A General Framework of Rotational Sparse Approximation in Uncertainty Quantification
A truncated Newton algorithm for nonconvex sparse recovery
Minimization of \(L_1\) Over \(L_2\) for Sparse Signal Recovery with Convergence Guarantee
An interior stochastic gradient method for a class of non-Lipschitz optimization problems
A unifying framework of high-dimensional sparse estimation with difference-of-convex (DC) regularizations
Minimizing \(L_1\) over \(L_2\) norms on the gradient
The springback penalty for robust signal recovery
A novel regularization based on the error function for sparse recovery
Generalized sparse recovery model and its neural dynamical optimization method for compressed sensing
Stable Image Reconstruction Using Transformed Total Variation Minimization
Transformed \(\ell_1\) regularization for learning sparse deep neural networks
Difference-of-Convex Learning: Directional Stationarity, Optimality, and Sparsity
Block sparse signal recovery via minimizing the block \(q\)-ratio sparsity
Transformed Schatten-1 penalty based full-rank latent label learning for incomplete multi-label classification
A nonconvex nonsmooth image prior based on the hyperbolic tangent function
Accelerated sparse recovery via gradient descent with nonlinear conjugate gradient momentum
A primal dual active set with continuation algorithm for high-dimensional nonconvex SICA-penalized regression
Sorted \(L_1/L_2\) minimization for sparse signal recovery
A wonderful triangle in compressed sensing
Distributed nonconvex constrained optimization over time-varying digraphs
\(L_1 - \beta L_q\) Minimization for Signal and Image Recovery
Enhanced total variation minimization for stable image reconstruction
Fast \(L_1\)-\(L_2\) minimization via a proximal operator
A proximal difference-of-convex algorithm with extrapolation
Iteratively Reweighted Group Lasso Based on Log-Composite Regularization
Consistency bounds and support recovery of d-stationary solutions of sparse sample average approximations
A Scale-Invariant Approach for Sparse Signal Recovery
The modified second APG method for DC optimization problems
Limited-Angle CT Reconstruction via the \(L_1/L_2\) Minimization
A Weighted Difference of Anisotropic and Isotropic Total Variation for Relaxed Mumford--Shah Color and Multiphase Image Segmentation


Uses Software


Cites Work