An unbiased approach to compressed sensing
From MaRDI portal
Publication: 5132273
DOI: 10.1088/1361-6420/abbd7f
zbMath: 1458.94080
arXiv: 1806.05283
OpenAlex: W2808430519
MaRDI QID: Q5132273
Daniele Gerosa, Carl Olsson, Marcus Carlsson
Publication date: 10 November 2020
Published in: Inverse Problems
Full work available at URL: https://arxiv.org/abs/1806.05283
Mathematics Subject Classification:
- Derivative-free methods and methods using generalized derivatives (90C56)
- Signal theory (characterization, reconstruction, filtering, etc.) (94A12)
Related Items
- An Unbiased Approach to Low Rank Recovery
- Bias versus non-convexity in compressed sensing
- On convex envelopes and regularization of non-convex functionals without moving global minima
- Von Neumann's trace inequality for Hilbert-Schmidt operators
Cites Work
- Coordinate descent algorithms for nonconvex penalized regression, with applications to biological feature selection
- Nearly unbiased variable selection under minimax concave penalty
- Relationship between the optimal solutions of least squares regularized with \(\ell_{0}\)-norm and constrained by \(k\)-sparsity
- A mathematical introduction to compressive sensing
- Optimal computational and statistical rates of convergence for sparse nonconvex learning problems
- Generalized sampling and infinite-dimensional compressed sensing
- Support recovery without incoherence: a case for nonconvex regularization
- Iterative hard thresholding for compressed sensing
- Iterative thresholding for sparse approximations
- Enhancing sparsity by reweighted \(\ell _{1}\) minimization
- The restricted isometry property and its implications for compressed sensing
- One-step sparse estimates in nonconcave penalized likelihood models
- Relaxed sparse eigenvalue conditions for sparse estimation via non-convex regularized regression
- Convex envelopes for fixed rank approximation
- Global convergence of ADMM in nonconvex nonsmooth optimization
- Nonconcave penalized likelihood with a diverging number of parameters.
- Convergence of descent methods for semi-algebraic and tame problems: proximal algorithms, forward-backward splitting, and regularized Gauss-Seidel methods
- Convex low rank approximation
- On convex envelopes and regularization of non-convex functionals without moving global minima
- Minimization of non-smooth, non-convex functionals by iterative thresholding
- Strong oracle optimality of folded concave penalized estimation
- Atomic Decomposition by Basis Pursuit
- Description of the Minimizers of Least Squares Regularized with $\ell_0$-norm. Uniqueness of the Global Minimizer
- BREAKING THE COHERENCE BARRIER: A NEW THEORY FOR COMPRESSED SENSING
- Robust principal component analysis?
- Compressed Sensing: How Sharp Is the Restricted Isometry Property?
- SparseNet: Coordinate Descent With Nonconvex Penalties
- A Continuous Exact $\ell_0$ Penalty (CEL0) for Least Squares Regularized Problem
- Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information
- Decoding by Linear Programming
- Stable recovery of sparse overcomplete representations in the presence of noise
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Sparse Regularization via Convex Analysis
- Sparse Approximate Solutions to Linear Systems
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell _{1}$-Constrained Quadratic Programming (Lasso)
- For most large underdetermined systems of linear equations the minimal 𝓁1‐norm solution is also the sparsest solution
- Signal Recovery by Proximal Forward-Backward Splitting
- Stable signal recovery from incomplete and inaccurate measurements
- Convex analysis and monotone operator theory in Hilbert spaces
- A general theory of concave regularization for high-dimensional sparse estimation problems