CoSaMP
swMATH ID: 8727 · MaRDI QID: Q20727
Author name not available
Official website: http://www.sciencedirect.com/science/article/pii/S1063520308000638
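For orientation: the algorithm this software implements (described in "CoSaMP: Iterative signal recovery from incomplete and inaccurate samples", listed below) recovers an \(s\)-sparse vector from compressive measurements by iterating support identification, least-squares estimation on the merged support, and pruning. The following is a minimal illustrative sketch, not the indexed software itself; the measurement matrix `A`, sparsity `s`, iteration cap, and stopping tolerance are all assumptions for the example.

```python
import numpy as np

def cosamp(A, y, s, max_iter=20, tol=1e-6):
    """Minimal CoSaMP sketch: recover an s-sparse x from y = A @ x.

    Illustrative only; assumes A behaves like a restricted-isometry
    matrix (e.g. scaled Gaussian) so that greedy recovery succeeds.
    """
    m, n = A.shape
    x = np.zeros(n)
    r = y.copy()                                  # current residual
    for _ in range(max_iter):
        proxy = A.T @ r                           # signal proxy
        omega = np.argsort(np.abs(proxy))[-2 * s:]  # 2s largest proxy entries
        T = np.union1d(omega, np.flatnonzero(x))  # merge with current support
        b = np.zeros(n)
        # least-squares fit restricted to the merged support
        b[T] = np.linalg.lstsq(A[:, T], y, rcond=None)[0]
        x = np.zeros(n)
        keep = np.argsort(np.abs(b))[-s:]         # prune to s largest entries
        x[keep] = b[keep]
        r = y - A @ x                             # update residual
        if np.linalg.norm(r) <= tol * np.linalg.norm(y):
            break
    return x
```

In the noiseless exact-sparse setting, once the merged support contains the true support, the restricted least-squares step recovers the signal exactly and the residual vanishes.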
Cited In (only showing first 100 items):
- Orthogonal one step greedy procedure for heteroscedastic linear models
- Iterative hard thresholding for compressed sensing
- Variable selection for high dimensional Gaussian copula regression model: an adaptive hypothesis testing procedure
- A mathematical introduction to compressive sensing
- Sparse Legendre expansions via \(\ell_1\)-minimization
- A second-order method for strongly convex \(\ell _1\)-regularization problems
- Bayesian inference for spatio-temporal spike-and-slab priors
- Coherence pattern-guided compressive sensing with unresolved grids
- A primal dual active set with continuation algorithm for the \(\ell^0\)-regularized optimization problem
- CoSaMP: Iterative signal recovery from incomplete and inaccurate samples
- Relationship between the optimal solutions of least squares regularized with \(\ell_{0}\)-norm and constrained by \(k\)-sparsity
- A geometrical stability condition for compressed sensing
- Phase transitions for greedy sparse approximation algorithms
- Signal Space CoSaMP for Sparse Recovery With Redundant Dictionaries
- A null space analysis of the \(\ell_1\)-synthesis method in dictionary-based compressed sensing
- Fast Phase Retrieval from Local Correlation Measurements
- LOL selection in high dimension
- Spectral compressive sensing
- Theory and applications of compressed sensing
- A non-adapted sparse approximation of PDEs with stochastic inputs
- Analysis of orthogonal multi-matching pursuit under restricted isometry property
- Robustness of orthogonal matching pursuit under restricted isometry property
- Convergence of fixed-point continuation algorithms for matrix rank minimization
- Sparse polynomial chaos expansions via compressed sensing and D-optimal design
- Compressed sensing and dynamic mode decomposition
- Compressed blind signal reconstruction model and algorithm
- A numerical exploration of compressed sampling recovery
- Compressed sensing with coherent and redundant dictionaries
- Performance of first- and second-order methods for \(\ell_1\)-regularized least squares problems
- CVX
- levmar
- PDCO
- BLOOMP
- Spot
- NESTA
- RCV1
- SPGL1
- TVAL3
- iPiano
- SparseLab
- MAG
- ESSC
- Mulprec
- Yall1
- na28
- Delta Sigma
- FPC_AS
- GAGA
- LMaFit
- Jellyfish
- L1-MAGIC
- Optspace
- TFOCS
- NetQuest
- ggks
- SparseFIS
- l1_ls
- CYCLADES
- Rice Wavelet Toolbox
- AccFFT
- DLMRI-Lab
- BrainWeb
- PROGRESS
- SAR
- TwIST
- GSPBOX
- C-HiLasso
- PhaseMax
- GESPAR
- SparsePR
- BlockPR
- ADMiRA
- Multi-PIE
- LPCCbnc
- LPbook
- PhaseLift
- Wirtinger Flow
- Quikr
- Fast \(k\)-selection algorithms for graphics processing units
- ADMiRA: Atomic Decomposition for Minimum Rank Approximation
- Robust sparse phase retrieval made easy
- On the minimization over sparse symmetric sets: projections, optimality conditions, and algorithms
- foba
- DCT_shortsupp_oneblock
- L0Learn
- Data-driven time-frequency analysis
- abess
- Quasi-linear compressed sensing
- A survey of compressed sensing
- Matrix recipes for hard thresholding methods
- Structured sparsity through convex optimization
- NESTA: A fast and accurate first-order method for sparse recovery
- Suprema of chaos processes and the restricted isometry property
- Decomposition into low-rank plus additive matrices for background/foreground separation: a review for a comparative evaluation with a large-scale dataset
- Newton method for \(\ell_0\)-regularized optimization
- Newton-type optimal thresholding algorithms for sparse optimization problems
- Multi-layer sparse coding: the holistic way
- Bayesian approach with extended support estimation for sparse linear regression
- Increasing the semantic storage density of sparse distributed memory
- Hierarchical compressed sensing
This page was built for software: CoSaMP