Uncertainty principles and ideal atomic decomposition
From MaRDI portal
Publication:4544728
Keywords: combinatorial optimization; convex optimization; uncertainty principle; harmonic analysis; wavelet analysis; matching pursuit; basis pursuit; discrete-time signal; overcomplete representation; ridgelet analysis; Logan's phenomenon; error-correcting encryption; highly sparse representation; multiple-basis signal representation
Recommendations
- Quantitative robust uncertainty principles and optimally sparse decompositions
- Uncertainty Principles and Signal Recovery
- A generalized uncertainty principle and sparse representation in pairs of bases
- Atomic Decomposition by Basis Pursuit
- Optimally sparse representation in general (nonorthogonal) dictionaries via \(\ell_1\) minimization
Cited in
(only the first 100 citing items are shown)
- Typical \(l_1\)-recovery limit of sparse vectors represented by concatenations of random orthogonal matrices
- Model selection with distributed SCAD penalty
- Model Selection With Lasso-Zero: Adding Straw to the Haystack to Better Find Needles
- Unconstrained \(\ell_1\)-\(\ell_2\) minimization for sparse recovery via mutual coherence
- A new bound on the block restricted isometry constant in compressed sensing
- Kernelized elastic net regularization: generalization bounds, and sparse recovery
- A performance guarantee for orthogonal matching pursuit using mutual coherence
- Sparse approximate reconstruction decomposed by two optimization problems
- A hierarchical framework for recovery in compressive sensing
- GMRES-accelerated ADMM for quadratic objectives
- Frames as codes
- On the sparsity of Lasso minimizers in sparse data recovery
- Sparse signals recovery from noisy measurements by orthogonal matching pursuit
- An extended DEIM algorithm for subset selection and class identification
- Evaluating visual properties via robust HodgeRank
- Beyond coherence: Recovering structured time-frequency representations
- Uncertainty principle corresponding to an orthonormal wavelet system
- Robust Identification of Differential Equations by Numerical Techniques from a Single Set of Noisy Observation
- Existence, uniqueness, and approximation solutions to linearized Chandrasekhar equation with sharp bounds
- DC approximation approach for \(\ell_0\)-minimization in compressed sensing
- Stable restoration and separation of approximately sparse signals
- An evaluation of the sparsity degree for sparse recovery with deterministic measurement matrices
- Weak stability of \(\ell_1\)-minimization methods in sparse data reconstruction
- \(\ell_1-\alpha\ell_2\) minimization methods for signal and image reconstruction with impulsive noise removal
- Generalized regression estimators with high-dimensional covariates
- Analysis of the ratio of \(\ell_1\) and \(\ell_2\) norms in compressed sensing
- Sparsity-promoting and edge-preserving maximum a posteriori estimators in non-parametric Bayesian inverse problems
- Local recovery bounds for prior support constrained compressed sensing
- Stability analysis of a class of sparse optimization problems
- Spherical designs and nonconvex minimization for recovery of sparse signals on the sphere
- On the uncertainty inequality as applied to discrete signals
- Model selection with low complexity priors
- Recovery of signals under the condition on RIC and ROC via prior support information
- Block-sparse recovery of semidefinite systems and generalized null space conditions
- Alternating direction method of multipliers for solving dictionary learning models
- Recovery of block sparse signals under the conditions on block RIC and ROC by BOMP and BOMMP
- Dictionary evaluation and optimization for sparse coding based speech processing
- Adaptive multi-penalty regularization based on a generalized Lasso path
- Optimal delocalization for generalized Wigner matrices
- A smoothing method for sparse optimization over convex sets
- Smoothing Newton method for \(\ell^0\)-\(\ell^2\) regularized linear inverse problem
- Unsupervised learning of compositional sparse code for natural image representation
- Sparse recovery in probability via \(l_q\)-minimization with Weibull random matrices for \(0 < q\leq 1\)
- A Scale-Invariant Approach for Sparse Signal Recovery
- Sparse recovery of sound fields using measurements from moving microphones
- A multiple measurement vector approach to synthetic aperture radar imaging
- Greedy subspace pursuit for joint sparse recovery
- An overview on the applications of matrix theory in wireless communications and signal processing
- Incoherent dictionary learning method based on unit norm tight frame and manifold optimization for sparse representation
- On the grouping effect of the \(l_{1-2}\) models
- Iteratively reweighted least squares and slime mold dynamics: connection and convergence
- Sharp sufficient conditions for stable recovery of block sparse signals by block orthogonal matching pursuit
- Strengthening hash families and compressive sensing
- IDENT: identifying differential equations with numerical time evolution
- Constructing New Weighted ℓ1-Algorithms for the Sparsest Points of Polyhedral Sets
- Asymptotic theory of \(\ell_1\)-regularized PDE identification from a single noisy trajectory
- Sparse recovery under weak moment assumptions
- Theory and applications of compressed sensing
- Global testing under sparse alternatives: ANOVA, multiple comparisons and the higher criticism
- A Practical Randomized CP Tensor Decomposition
- Sparse linear regression from perturbed data
- Recent advances in mathematical programming with semi-continuous variables and cardinality constraint
- Compressive Sensing
- Computing and analyzing recoverable supports for sparse reconstruction
- The asymptotic distribution and Berry-Esseen bound of a new test for independence in high dimension with an application to stochastic optimization
- Approximation accuracy, gradient methods, and error bound for structured convex optimization
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- A primal dual active set with continuation algorithm for the \(\ell^0\)-regularized optimization problem
- On the conditioning of random subdictionaries
- Optimally sparse representation in general (nonorthogonal) dictionaries via \(\ell_1\) minimization
- Limiting laws of coherence of random matrices with applications to testing covariance structure and construction of compressed sensing matrices
- Sharp RIP bound for sparse signal and low-rank matrix recovery
- Matrix-free interior point method for compressed sensing problems
- Null space conditions and thresholds for rank minimization
- A new computational method for the sparsest solutions to systems of linear equations
- A gradient enhanced \(\ell_{1}\)-minimization for sparse approximation of polynomial chaos expansions
- Estimating the dimension of a model
- Exact low-rank matrix recovery via nonconvex Schatten \(p\)-minimization
- A sharp RIP condition for orthogonal matching pursuit
- Regularity properties for sparse regression
- Sure independence screening for ultrahigh dimensional feature space. With discussion and authors' reply
- Computational Aspects of Constrained \(L_1\)-\(L_2\) Minimization for Compressive Sensing
- Compressed sensing from a harmonic analysis point of view
- Beyond sparsity: recovering structured representations by \({\ell}^1\) minimization and greedy algorithms
- On verifiable sufficient conditions for sparse signal recovery via \(\ell_{1}\) minimization
- Sparse representations and approximation theory
- On the uniqueness of overcomplete dictionaries, and a practical way to retrieve them
- On a unified view of nullspace-type conditions for recoveries associated with general sparsity structures
- Randomized first order algorithms with applications to \(\ell _{1}\)-minimization
- Phase transition in limiting distributions of coherence of high-dimensional random matrices
- Bayesian factor-adjusted sparse regression
- Verifiable conditions of \(\ell_{1}\)-recovery for sparse signals with sign restrictions
- Covariate assisted screening and estimation
- Uncertainty Principles and Signal Recovery
- Quantitative robust uncertainty principles and optimally sparse decompositions
- Feature selection when there are many influential features
- Linear program relaxation of sparse nonnegative recovery in compressive sensing microarrays
- Sparse estimation via lower-order penalty optimization methods in high-dimensional linear regression
- Best subset selection via a modern optimization lens
- Deterministic matrices matching the compressed sensing phase transitions of Gaussian random matrices
This page was built for publication: Uncertainty principles and ideal atomic decomposition