A simple homotopy proximal mapping algorithm for compressive sensing
Publication: 2425244
DOI: 10.1007/s10994-018-5772-7 · zbMATH Open: 1493.94008 · OpenAlex: W2901183496 · Wikidata: Q128937890 (Scholia: Q128937890) · MaRDI QID: Q2425244 · FDO: Q2425244
Tianbao Yang, Lijun Zhang, Zhi-Hua Zhou, Rong Jin, Shenghuo Zhu
Publication date: 26 June 2019
Published in: Machine Learning
Full work available at URL: https://doi.org/10.1007/s10994-018-5772-7
Cites Work
- NESTA: A Fast and Accurate First-Order Method for Sparse Recovery
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- A Fast Algorithm for Sparse Reconstruction Based on Shrinkage, Subspace Optimization, and Continuation
- Probing the Pareto Frontier for Basis Pursuit Solutions
- Least angle regression. (With discussion)
- Title not available
- On the conditions used to prove oracle results for the Lasso
- Simultaneous analysis of Lasso and Dantzig selector
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- High-dimensional graphs and variable selection with the Lasso
- Title not available
- Atomic Decomposition by Basis Pursuit
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell_1$-Constrained Quadratic Programming (Lasso)
- Extensions of Lipschitz mappings into a Hilbert space
- Matching pursuits with time-frequency dictionaries
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- Decoding by Linear Programming
- Just relax: convex programming methods for identifying sparse signals in noise
- A new approach to variable selection in least squares problems
- Stable signal recovery from incomplete and inaccurate measurements
- Oracle inequalities in empirical risk minimization and sparse recovery problems. École d'Été de Probabilités de Saint-Flour XXXVIII-2008.
- Convex Analysis
- Signal Recovery From Random Measurements Via Orthogonal Matching Pursuit
- Compressed sensing
- Sparse Reconstruction by Separable Approximation
- Iterative hard thresholding for compressed sensing
- Some sharp performance bounds for least squares regression with \(L_1\) regularization
- CoSaMP: Iterative signal recovery from incomplete and inaccurate samples
- Uniform uncertainty principle and signal recovery via regularized orthogonal matching pursuit
- Fixed-Point Continuation for $\ell_1$-Minimization: Methodology and Convergence
- Greed is Good: Algorithmic Results for Sparse Approximation
- Observed universality of phase transitions in high-dimensional geometry, with implications for modern data analysis and signal processing
- The Computational Complexity of the Restricted Isometry Property, the Nullspace Property, and Related Concepts in Compressed Sensing
- Adaptive greedy approximations
- Title not available
- A Bound on Tail Probabilities for Quadratic Forms in Independent Random Variables
- Bonferroni inequalities
- Sparse Representation of a Polytope and Recovery of Sparse Signals and Low-Rank Matrices
- The restricted isometry property and its implications for compressed sensing
- Atomic decomposition by basis pursuit
- One-bit compressed sensing by linear programming
- Matrix recipes for hard thresholding methods
- Sparse Solution of Underdetermined Systems of Linear Equations by Stagewise Orthogonal Matching Pursuit
- Accurate Prediction of Phase Transitions in Compressed Sensing via a Connection to Minimax Denoising
- The Noise-Sensitivity Phase Transition in Compressed Sensing
- Hard Thresholding Pursuit: An Algorithm for Compressive Sensing
- Fast Solution of $\ell_1$-Norm Minimization Problems When the Solution May Be Sparse
- Linear convergence of iterative soft-thresholding
- An infeasible-point subgradient method using adaptive approximate projections
- Solving Basis Pursuit
- A proximal-gradient homotopy method for the sparse least-squares problem
- Fast global convergence of gradient methods for high-dimensional statistical recovery
- A sparse Johnson-Lindenstrauss transform
- Sparser Johnson-Lindenstrauss Transforms
- Decomposable norm minimization with proximal-gradient homotopy algorithm
- Sparse Recovery of Streaming Signals Using $\ell_1$-Homotopy
- A primal-dual homotopy algorithm for \(\ell _{1}\)-minimization with \(\ell _{\infty }\)-constraints
- Sharp Time–Data Tradeoffs for Linear Inverse Problems
- An adaptive accelerated proximal gradient method and its homotopy continuation for sparse optimization
Cited In (1)