The \(\ell_{2,q}\) regularized group sparse optimization: lower bound theory, recovery bound and algorithms (Q778013)

From MaRDI portal

scientific article; zbMATH DE number 7216377

      Statements

      The \(\ell_{2,q}\) regularized group sparse optimization: lower bound theory, recovery bound and algorithms (English)
      30 June 2020
      This paper deals with the classical signal recovery problem of restoring a vector \(x^{0}\in \mathbb{R}^{N}\) from a noisy observation \(d\). The problem is modeled as \(d=Ax^0+\xi\), where \(A\in \mathbb{R}^{M\times N}\) is the measurement matrix and \(\xi \in \mathbb{R}^M\) represents the measurement error. The authors consider the case where the data to be restored have a natural structure. In particular, they assume that vectors \(x=(x_1,x_2,\ldots ,x_N)\in \mathbb{R}^N\) have a predefined group structure \(x=(x_{\mathcal{G}_1},x_{\mathcal{G}_2},\ldots ,x_{\mathcal{G}_n})\), where \(\mathcal{G}_i\subset \{1,\ldots ,N\}\) and \(x_{\mathcal{G}_i}\) denotes the \(i\)-th group of \(x\) for \(i\in \{1,2,\ldots,n\}\). In this setting, the group sparsity of \(x\) is better measured by the nonconvex \(\ell _{p,q}\) quasi-norm, defined by \(\|x\|_{p,q}:=\left( \sum_{i=1}^n \|x_{\mathcal{G}_i}\|_p^q \right)^{1/q},\) where \(p\geq 1\) and \(q\in (0,1)\). The authors take \(p=2\) and focus on minimizing the \(\ell _{2,q}\) regularized problem \( \underset{x\in \mathbb{R}^{N}}{\mathrm{min}}\,\left\{\|Ax-d\|^{2}+\|x\|_{2,q}^{q}\right\}.\) They provide sufficient conditions for stationary points to be local minimizers and give a uniform lower bound for the nonzero groups of local minimizers. They also propose a new IRLS (iteratively reweighted least squares) algorithm with thresholding, together with an error bound analysis. The new algorithm compares favorably with the classical IRLS algorithm with smoothing in both global convergence and computational efficiency. The paper also contains numerical experiments.
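      The group \(\ell_{2,q}\) penalty and the classical IRLS-with-smoothing baseline mentioned in the review can be sketched as follows. This is an illustrative sketch only, not the authors' thresholded variant: the regularization weight `lam`, the smoothing parameter `eps`, and all function names are assumptions introduced for illustration (the review's objective corresponds to `lam = 1`).

```python
import numpy as np

def group_l2q(x, groups, q):
    """||x||_{2,q}^q = sum_i ||x_{G_i}||_2^q  (nonconvex group sparsity measure)."""
    return sum(np.linalg.norm(x[g]) ** q for g in groups)

def objective(A, d, x, groups, q, lam=1.0):
    """Data fit ||Ax - d||^2 plus lam * ||x||_{2,q}^q."""
    return np.linalg.norm(A @ x - d) ** 2 + lam * group_l2q(x, groups, q)

def irls_smoothed(A, d, groups, q, lam=1.0, eps=1e-6, iters=20):
    """Classical IRLS with smoothing (sketch): each step solves the weighted
    least-squares system (A^T A + lam * D) x = A^T d, where D holds the
    per-group weight (q/2) * (||x_{G_i}||^2 + eps^2)^(q/2 - 1), obtained by
    majorizing the smoothed penalty (||x_{G_i}||^2 + eps^2)^(q/2) at the
    current iterate."""
    N = A.shape[1]
    x = np.ones(N)  # simple initialization for illustration
    for _ in range(iters):
        D = np.zeros(N)
        for g in groups:
            w = (q / 2.0) * (np.linalg.norm(x[g]) ** 2 + eps ** 2) ** (q / 2.0 - 1.0)
            D[g] = w  # same weight for every coordinate in the group
        x = np.linalg.solve(A.T @ A + lam * np.diag(D), A.T @ d)
    return x
```

      The smoothing parameter `eps` keeps the reweighting step well defined when a group norm hits zero; the authors' proposed variant replaces this smoothing with a thresholding step, which is what their convergence and efficiency comparison is about.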
      group sparsity
      global convergence
      recovery bound
      lower bound
      IRLS
      signal recovery

      Identifiers