Improving M-SBL for Joint Sparse Recovery Using a Subspace Penalty

From MaRDI portal
Publication: 4580973

DOI: 10.1109/TSP.2015.2477049 · zbMATH Open: 1395.94185 · arXiv: 1503.06679 · OpenAlex: W1733556918 · MaRDI QID: Q4580973 · FDO: Q4580973


Authors: Jong Chul Ye, Jong-Min Kim, Yoram Bresler


Publication date: 22 August 2018

Published in: IEEE Transactions on Signal Processing

Abstract: The multiple measurement vector (MMV) problem is a generalization of the compressed sensing problem that addresses the recovery of a set of jointly sparse signal vectors. One of the important contributions of this paper is to reveal that two seemingly unrelated state-of-the-art MMV joint sparse recovery algorithms, M-SBL (multiple sparse Bayesian learning) and subspace-based hybrid greedy algorithms, have a very important link. More specifically, we show that replacing the log det(·) term in M-SBL by a rank proxy that exploits the spark reduction property discovered in subspace-based joint sparse recovery algorithms provides significant improvements. In particular, if we use the Schatten-p quasi-norm as the corresponding rank proxy, the global minimiser of the proposed algorithm becomes identical to the true solution as p → 0. Furthermore, under the same regularity conditions, we show that convergence to a local minimiser is guaranteed using an alternating minimization algorithm in which each minimization step is convex and has a closed-form expression. Numerical simulations under a variety of scenarios, in terms of SNR and the condition number of the signal amplitude matrix, demonstrate that the proposed algorithm consistently outperforms M-SBL and other state-of-the-art algorithms.
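The rank-proxy property behind the abstract can be illustrated numerically: for a matrix with singular values σ_i, the Schatten-p quantity Σ σ_iᵖ tends to the count of nonzero singular values, i.e. the rank, as p → 0. The sketch below is not from the paper; the matrix, seed, and function name are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
# Build a 6x6 matrix of rank 2 as a product of thin Gaussian factors.
M = rng.standard_normal((6, 2)) @ rng.standard_normal((2, 6))

def schatten_p(A, p):
    """Schatten-p quasi-norm raised to the p-th power: sum of sigma_i**p."""
    s = np.linalg.svd(A, compute_uv=False)
    s = s[s > 1e-12]          # discard numerically zero singular values
    return float(np.sum(s ** p))

for p in (1.0, 0.5, 0.1, 0.01):
    print(f"p = {p:4}: sum sigma_i^p = {schatten_p(M, p):.4f}")
# As p -> 0 each sigma_i^p -> 1 for nonzero sigma_i, so the sum
# approaches rank(M) = 2, which is why it serves as a rank proxy.
```

For small p the printed values cluster near 2, the rank of M, whereas the nuclear norm (p = 1) depends on the magnitudes of the singular values.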


Full work available at URL: https://arxiv.org/abs/1503.06679







Cited In (2)





This page was built for publication: Improving M-SBL for Joint Sparse Recovery Using a Subspace Penalty
