A Tight Bound of Hard Thresholding
Publication: 4558539
zbMath: 1473.62287 · arXiv: 1605.01656 · MaRDI QID: Q4558539
No author found.
Publication date: 22 November 2018
Full work available at URL: https://arxiv.org/abs/1605.01656
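For readers unfamiliar with the operator named in the title: hard thresholding keeps the \(k\) largest-magnitude entries of a vector and zeroes out the rest. The following is a minimal illustrative sketch (the function name and example values are our own, not part of this record):

```python
import numpy as np

def hard_threshold(x, k):
    """Keep the k largest-magnitude entries of x; set all others to zero."""
    out = np.zeros_like(x)
    if k <= 0:
        return out
    idx = np.argpartition(np.abs(x), -k)[-k:]  # indices of the k largest |x_i|
    out[idx] = x[idx]
    return out

# Example: keep the 2 largest-magnitude entries of a length-5 vector.
x = np.array([0.3, -2.0, 0.1, 1.5, -0.2])
print(hard_threshold(x, 2))  # -> [ 0.  -2.   0.   1.5  0. ]
```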
Related Items (12)
- A new conjugate gradient hard thresholding pursuit algorithm for sparse signal recovery
- Gradient projection Newton pursuit for sparsity constrained optimization
- Improved RIP-based bounds for guaranteed performance of two compressed sensing algorithms
- Unnamed Item
- Heavy-ball-based hard thresholding algorithms for sparse signal recovery
- A tight bound of modified iterative hard thresholding algorithm for compressed sensing.
- Adaptive iterative hard thresholding for low-rank matrix recovery and rank-one measurements
- An Equivalence between Critical Points for Rank Constraints Versus Low-Rank Factorizations
- Jointly low-rank and bisparse recovery: Questions and partial answers
- Binary sparse signal recovery with binary matching pursuit *
- Unnamed Item
- Unnamed Item
Uses Software
Cites Work
- Unnamed Item
- Unnamed Item
- Unnamed Item
- Unnamed Item
- Unnamed Item
- Unnamed Item
- A mathematical introduction to compressive sensing
- Sparse principal component analysis and iterative thresholding
- Restricted isometry property of matrices with independent columns and neighborly polytopes by random sampling
- High-dimensional regression with noisy and missing data: provable guarantees with nonconvexity
- Iterative hard thresholding for compressed sensing
- Iterative thresholding for sparse approximations
- Fast global convergence of gradient methods for high-dimensional statistical recovery
- The restricted isometry property and its implications for compressed sensing
- CoSaMP: Iterative signal recovery from incomplete and inaccurate samples
- Uniform uncertainty principle and signal recovery via regularized orthogonal matching pursuit
- A simple proof of the restricted isometry property for random matrices
- Introductory lectures on convex optimization. A basic course.
- Least angle regression. (With discussion)
- The convex geometry of linear inverse problems
- Sharp RIP bound for sparse signal and low-rank matrix recovery
- Simultaneous analysis of Lasso and Dantzig selector
- Bounds of restricted isometry constants in extreme asymptotics: formulae for Gaussian matrices
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- Sparse Recovery Algorithms: Sufficient Conditions in Terms of Restricted Isometry Constants
- Performance comparisons of greedy algorithms in compressed sensing
- Accurate Prediction of Phase Transitions in Compressed Sensing via a Connection to Minimax Denoising
- Improved Bounds on Restricted Isometry Constants for Gaussian Matrices
- Hard Thresholding Pursuit: An Algorithm for Compressive Sensing
- Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information
- Decoding by Linear Programming
- Stable recovery of sparse overcomplete representations in the presence of noise
- Greed is Good: Algorithmic Results for Sparse Approximation
- Signal Recovery From Random Measurements Via Orthogonal Matching Pursuit
- Fast Solution of $\ell _{1}$-Norm Minimization Problems When the Solution May Be Sparse
- Atomic Decomposition by Basis Pursuit
- Safe and Effective Importance Sampling
- Linear Convergence of Stochastic Iterative Greedy Algorithms With Sparse Constraints
- On the Recovery Limit of Sparse Signals Using Orthogonal Matching Pursuit
- An iterative thresholding algorithm for linear inverse problems with a sparsity constraint
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell _{1}$-Constrained Quadratic Programming (Lasso)
- Subspace Pursuit for Compressive Sensing Signal Reconstruction
- A Proximal Stochastic Gradient Method with Progressive Variance Reduction
- A Remark on the Restricted Isometry Property in Orthogonal Matching Pursuit
- Minimax Rates of Estimation for High-Dimensional Linear Regression Over $\ell_q$-Balls
- Sparse Recovery With Orthogonal Matching Pursuit Under RIP
- New Bounds for Restricted Isometry Constants
- Greedy Sparsity-Constrained Optimization
- Regularized M-estimators with nonconvexity: Statistical and algorithmic theory for local optima
- Compressed sensing
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers