Content-aware compressive sensing recovery using Laplacian scale mixture priors and side information (Q1721345)
From MaRDI portal
scientific article
Language | Label | Description | Also known as
---|---|---|---
English | Content-aware compressive sensing recovery using Laplacian scale mixture priors and side information | scientific article |
Statements
Content-aware compressive sensing recovery using Laplacian scale mixture priors and side information (English)
8 February 2019
Summary: Nonlocal methods have shown great potential in many image restoration tasks, including compressive sensing (CS) reconstruction, through the use of an image self-similarity prior. However, they are still limited in recovering fine-scale details and sharp features when rich repetitive patterns cannot be guaranteed and, moreover, the CS measurements are corrupted. In this paper, we propose a novel CS recovery algorithm that combines nonlocal sparsity with local and global priors, which soften and complement the self-similarity assumption for irregular structures. First, a Laplacian scale mixture (LSM) prior is used to model dependencies among similar patches: to achieve group sparsity, each singular value of a group of packed similar patches is modeled as Laplacian-distributed with its own scale parameter. Second, a global prior and a compensation-based sparsity prior on local patches are designed to preserve the differences between packed patches. The former is a prediction that integrates information from an independent processing stage and serves as side information, while the latter enforces a small (i.e., sparse) prediction error and is also modeled with the LSM so as to obtain local sparsity. Finally, we derive an efficient algorithm based on the expectation-maximization (EM) and approximate message passing (AMP) frameworks for the maximum a posteriori (MAP) estimation of the sparse coefficients. Numerical experiments show that the proposed method outperforms many existing CS recovery algorithms.
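As an illustration of the group-sparsity step described in the summary, the following is a minimal sketch under simplifying assumptions: similar patches are vectorized and stacked into a matrix, each singular value of that matrix is treated as Laplacian-distributed with its own scale parameter, and MAP estimation under that prior reduces to soft-thresholding each singular value with its own threshold. The function name `group_shrinkage`, the inverse-magnitude reweighting rule, and the noise level `sigma` are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def group_shrinkage(patch_group: np.ndarray, sigma: float, eps: float = 1e-8) -> np.ndarray:
    """Sketch of LSM-style group sparsity: soft-threshold the singular values
    of a stacked-patch matrix, with per-value thresholds derived from an
    (assumed) inverse-magnitude scale parameter.

    patch_group : (patch_dim, num_similar_patches) matrix of vectorized patches
    sigma       : assumed noise standard deviation
    """
    u, s, vt = np.linalg.svd(patch_group, full_matrices=False)
    scales = 1.0 / (s + eps)                 # larger singular values -> smaller thresholds
    thresholds = (sigma ** 2) * scales
    s_map = np.maximum(s - thresholds, 0.0)  # soft-thresholding = Laplacian MAP estimate
    return (u * s_map) @ vt                  # reassemble the shrunk patch group

# Toy usage: restore a nearly rank-one group of 8 noisy patches (vectorized to length 64).
rng = np.random.default_rng(0)
clean = np.outer(rng.standard_normal(64), np.ones(8))
noisy = clean + 0.1 * rng.standard_normal(clean.shape)
restored = group_shrinkage(noisy, sigma=0.1)
```

Roughly, in the method summarized above such a group shrinkage would act as the denoising step inside an EM/AMP outer loop that alternates between updating the scale parameters and performing MAP estimation of the sparse coefficients, with the side-information prediction supplying the reference against which local prediction errors are kept sparse.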