Sparse total least squares: analysis and greedy algorithms (Q1938584)

scientific article

    Language: English
    Label: Sparse total least squares: analysis and greedy algorithms
    Description: scientific article

    Statements

    Sparse total least squares: analysis and greedy algorithms (English)
    21 February 2013
    The sparse total least squares (TLS) problem, in which the solution is required to have \(s\) nonzero elements, is studied. Linear systems in which both the matrix and the right-hand side are affected by noise or uncertainties are typically solved via TLS. A set of conditions under which the sparse least squares (LS) and sparse TLS problems have the same support is presented. The conditions are expressed in terms of restricted isometry constants and the minimum nonzero principal angle between subspaces spanned by \(s\) columns of the matrix. Several greedy algorithms for solving the sparse TLS problem are examined. The authors conclude that an effective approach is to compute the support of the LS solution and then solve the standard TLS problem restricted to that support; doing so with orthogonal matching pursuit (OMP) gives consistently good results. Another greedy strategy extends the support by the column forming the smallest angle with the subspace (SAS) spanned by the already selected columns, a combined LS-TLS heuristic. Simulations show that SAS gives the best results for small problem sizes and high to medium signal-to-noise ratios, while OMP is best or nearly best in almost all cases.
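    The two-step heuristic highlighted above (select a support with orthogonal matching pursuit, then solve a standard TLS problem restricted to that support) can be sketched as follows. This is an illustrative NumPy outline under generic assumptions, not the authors' implementation; the function names omp_support and restricted_tls are hypothetical.

    import numpy as np

    def omp_support(A, b, s):
        # Greedy OMP: pick s columns of A that best explain b via LS residuals.
        support, r = [], b.copy()
        for _ in range(s):
            corr = np.abs(A.T @ r)          # correlation of each column with the residual
            corr[support] = -np.inf         # do not reselect already chosen columns
            support.append(int(np.argmax(corr)))
            x_s, *_ = np.linalg.lstsq(A[:, support], b, rcond=None)
            r = b - A[:, support] @ x_s     # update the residual after the LS refit
        return sorted(support)

    def restricted_tls(A, b, support):
        # Classical TLS on the selected columns: take the right singular vector
        # of the smallest singular value of the augmented matrix [A_S, b].
        A_s = A[:, support]
        k = A_s.shape[1]
        _, _, Vt = np.linalg.svd(np.column_stack([A_s, b]))
        v = Vt[-1]
        x_s = -v[:k] / v[k]                 # assumes the last entry is nonzero (generic case)
        x = np.zeros(A.shape[1])
        x[support] = x_s
        return x

    # Small usage example: noise in both the matrix and the right-hand side.
    rng = np.random.default_rng(0)
    A0 = rng.standard_normal((50, 20))
    x_true = np.zeros(20)
    x_true[[3, 7, 12]] = [1.0, -2.0, 0.5]
    A = A0 + 0.01 * rng.standard_normal(A0.shape)
    b = A0 @ x_true + 0.01 * rng.standard_normal(50)
    x_hat = restricted_tls(A, b, omp_support(A, b, s=3))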
    sparse total least squares
    sparse solutions
    greedy algorithms
    orthogonal matching pursuit

    Identifiers