Sparse regression using mixed norms (Q734328)
From MaRDI portal
scientific article
Language | Label | Description | Also known as
---|---|---|---
English | Sparse regression using mixed norms | scientific article |
Statements
Sparse regression using mixed norms (English)
20 October 2009
The article deals with mixed norms suitable for solving regression problems so as to exploit both structure and sparsity. Regression problems are reformulated as optimization problems. Multi-layered expansions on unions of signal dictionaries are considered; these expansions are computed under an exact reconstruction constraint by means of a modified FOCUSS algorithm.

The paper is organized in six sections. Section 1 is devoted to the introduction, the presentation of the problem, and the definition of mixed norms. Section 2 deals with properties of mixed norms. In Section 3 the author treats the problem of signal estimation subject to equality constraints: the algorithm is presented, convergence results are established, and the extension to multi-layered expansions is developed. The infinite-dimensional case, which corresponds to signal estimation in the presence of noise, is considered in Section 4. Section 5 illustrates the algorithms previously introduced and the influence of the mixed norms. Finally, conclusions and an outlook are given in Section 6.

The article is difficult to read for people not experienced in this topic. In general, the author explains the problems and the proofs of the theorems in detail (some of the proofs are given in the Appendix), and makes an effort to write in a self-contained way; in particular, the solution of the problem in Section 4 is very well explained. Each algorithm presented in the paper is supported by theorems and convergence results. The examples illustrating the developed ideas come from audio signal processing, and the techniques could be adapted to image processing problems in a straightforward manner. The conclusions contain an interesting table comparing the advantages and drawbacks of the algorithms, and some extensions of the method are suggested as future work.
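To make the two central ingredients of the review concrete, the following is a minimal sketch of an ℓ_{p,q} mixed norm and of the classical (unmodified) FOCUSS iteration for the equality-constrained problem min ‖x‖_p subject to Ax = b. The grouping scheme, parameter choices, and the iteratively reweighted update used here are the standard textbook formulations, not necessarily the exact variants developed in the paper; in particular, the paper's *modified* FOCUSS adapts the reweighting to the mixed-norm structure, which is not reproduced here.

```python
import numpy as np

def mixed_norm(x, groups, p=2.0, q=1.0):
    # l_{p,q} mixed norm: an inner l_p norm over the coefficients of each
    # group, then an outer l_q norm across the per-group values.
    # With p=2, q=1 this promotes sparsity *across* groups while keeping
    # the coefficients *within* an active group jointly nonzero.
    inner = np.array([np.linalg.norm(x[g], ord=p) for g in groups])
    return float(np.linalg.norm(inner, ord=q))

def focuss(A, b, p=1.0, n_iter=50, eps=1e-10):
    # Baseline FOCUSS for min ||x||_p subject to Ax = b, written as an
    # iteratively reweighted minimum-norm solution:
    #   x_{k+1} = D_k A^T (A D_k A^T)^{-1} b,   D_k = diag(|x_k|^{2-p}).
    # eps regularizes the inner solve once entries have shrunk to zero.
    m, _ = A.shape
    x = np.linalg.pinv(A) @ b          # minimum l2-norm starting point
    for _ in range(n_iter):
        d = np.abs(x) ** (2.0 - p)     # diagonal of D_k
        G = (A * d) @ A.T              # A D_k A^T  (column scaling by d)
        x = d * (A.T @ np.linalg.solve(G + eps * np.eye(m), b))
    return x
```

The reweighting step is what drives sparsity: entries of x that shrink get ever-smaller weights d and are eventually pinned to zero, while the linear solve keeps the exact reconstruction constraint Ax = b (approximately, up to the eps regularization) satisfied at every iteration.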
sparse regression
structured regression
mixed norms
FOCUSS