A unifying analysis of projected gradient descent for \(\ell_p\)-constrained least squares (Q1948598)
scientific article
| Language | Label | Description | Also known as |
|---|---|---|---|
| English | A unifying analysis of projected gradient descent for \(\ell_p\)-constrained least squares | scientific article | |
Statements
A unifying analysis of projected gradient descent for \(\ell_p\)-constrained least squares (English)
24 April 2013
The authors study the accuracy of the projected gradient descent (PGD) algorithm for solving sparse least squares problems in which sparsity is enforced by an \(\ell_p\)-norm constraint. Assuming the availability of an algorithm that can project any given point onto the \(\ell_p\)-ball for \(p\in[0, 1]\), it is shown that the PGD method converges to the true signal, up to a statistical precision, at a linear rate. The convergence guarantees in this paper are obtained by requiring the measurement matrix \(A\) to satisfy a suitable restricted isometry property of order \(s\), with restricted isometry constants \(\alpha_s\) and \(\beta_s\) such that \(\beta_s\|x\|_2^2\leq \|Ax\|_2^2\leq \alpha_s\|x\|_2^2\) holds for all \(s\)-sparse vectors \(x\). As \(p\) increases from zero to one, these sufficient conditions become more stringent, while the robustness to noise and the convergence rate worsen. This behavior suggests that smaller values of \(p\) are preferable; in fact, the PGD method at \(p = 0\) (i.e., the iterative hard thresholding algorithm) outperforms the PGD method at \(p > 0\) in every aspect. These conclusions are not definitive, however, since only sufficient conditions for the accuracy of the PGD method are presented.
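To make the iteration concrete, the following is a minimal Python sketch of PGD for the \(p = 0\) case, where the projection onto the \(\ell_0\)-ball amounts to hard thresholding (keeping the \(s\) largest-magnitude entries). The function names, the conservative step-size rule, and the example data are illustrative assumptions, not taken from the paper; in particular, iterative hard thresholding is usually stated with unit step size, and the paper's exact step-size conditions may differ.

```python
import numpy as np

def project_l0(x, s):
    """Project x onto the set of s-sparse vectors (hard thresholding):
    keep the s largest-magnitude entries and zero out the rest."""
    z = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-s:]  # indices of the s largest-magnitude entries
    z[idx] = x[idx]
    return z

def pgd_least_squares(A, y, s, step=None, iters=100):
    """Projected gradient descent for min (1/2)||Ax - y||_2^2 s.t. ||x||_0 <= s.
    With unit step size this reduces to iterative hard thresholding."""
    m, n = A.shape
    if step is None:
        # conservative step size: 1 / L, where L = ||A||_2^2 is the
        # Lipschitz constant of the gradient of the least-squares loss
        step = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(n)
    for _ in range(iters):
        grad = A.T @ (A @ x - y)          # gradient of (1/2)||Ax - y||_2^2
        x = project_l0(x - step * grad, s)  # gradient step, then projection
    return x

# usage: recover a 5-sparse signal from 80 noisy random measurements
rng = np.random.default_rng(0)
n, m, s = 200, 80, 5
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, s, replace=False)] = rng.standard_normal(s)
y = A @ x_true + 0.01 * rng.standard_normal(m)
x_hat = pgd_least_squares(A, y, s)
print(np.linalg.norm(x_hat - x_true))
```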
compressed sensing
sparsity
underdetermined linear systems
restricted isometry property
projected gradient descent algorithm
sparse least squares problems
convergence