Sparse approximation is provably hard under coherent dictionaries
From MaRDI portal
Abstract: It is well known that the sparse approximation problem is NP-hard under general dictionaries. Several algorithms have been devised and analyzed in the past decade under various assumptions on the coherence of the dictionary, represented by a matrix from which a subset of column vectors is selected. All these results assume that the coherence is sufficiently small. This article is an attempt to bridge the big gap between the negative result of NP-hardness under general dictionaries and the positive results under this restrictive assumption. In particular, it suggests that the aforementioned assumption might be asymptotically the best one can make to arrive at any efficient algorithmic result under well-known conjectures of complexity theory. In establishing the results, we make use of a new, simple multilayered PCP, tailored to yield a matrix with small coherence, combined with our reduction.
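The coherence referred to in the abstract is the standard mutual coherence of a dictionary matrix: the largest absolute inner product between any two distinct normalized columns. As an illustration (not taken from the article), a minimal sketch of computing it with NumPy:

```python
import numpy as np

def mutual_coherence(A: np.ndarray) -> float:
    """Mutual coherence of a dictionary matrix A (columns are atoms):
    the maximum absolute inner product between distinct normalized columns."""
    # Normalize each column to unit Euclidean norm.
    A = A / np.linalg.norm(A, axis=0)
    # Gram matrix of normalized atoms; off-diagonal entries are pairwise correlations.
    G = np.abs(A.T @ A)
    np.fill_diagonal(G, 0.0)
    return float(G.max())

# An orthonormal dictionary has coherence 0; highly correlated atoms push it toward 1.
print(mutual_coherence(np.eye(4)))  # → 0.0
```

Positive algorithmic results for greedy methods such as Orthogonal Matching Pursuit (several are listed under "Cites work" below) typically require this quantity to be small relative to the sparsity level.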
Cites work
- scientific article (zbMATH DE number 2079345; title unavailable)
- A Parallel Repetition Theorem
- A threshold of ln n for approximating set cover
- Adaptive greedy approximations
- Analysis of Orthogonal Matching Pursuit Using the Restricted Isometry Property
- Column subset selection problem is UG-hard
- Compressed sensing with coherent and redundant dictionaries
- Exponential inapproximability of selecting a maximum volume sub-matrix
- Greed is Good: Algorithmic Results for Sparse Approximation
- Greedy algorithms and \(M\)-term approximation with regard to redundant dictionaries
- On Lebesgue-type inequalities for greedy approximation
- On performance of greedy algorithms
- On the hardness of approximating minimization problems
- On the optimality of the orthogonal greedy algorithm for \(\mu\)-coherent dictionaries
- On the power of unique 2-prover 1-round games
- Optimal Inapproximability Results for MAX-CUT and Other 2-Variable CSPs?
- Probabilistic checking of proofs
- Proof verification and the hardness of approximation problems
- Sparse Approximate Solutions to Linear Systems
- The Orthogonal Super Greedy Algorithm and Applications in Compressed Sensing
- The hardness of approximate optima in lattices, codes, and systems of linear equations
- Weak greedy algorithms
Cited in (6)
- NP-hardness and inapproximability of sparse PCA
- Average Performance of the Sparsest Approximation Using a General Dictionary
- Approximation hardness for a class of sparse optimization problems
- On some deterministic dictionaries supporting sparsity
- A note on the hardness of sparse approximation
- scientific article (zbMATH DE number 2079345; title unavailable)
This page was built for publication: Sparse approximation is provably hard under coherent dictionaries