A Preconditioned Riemannian Gradient Descent Algorithm for Low-Rank Matrix Recovery

From MaRDI portal
Publication: Q6435261

arXiv: 2305.02543
MaRDI QID: Q6435261
FDO: Q6435261


Authors: Fengmiao Bian, Jian-Feng Cai, Rui Zhang


Publication date: 4 May 2023

Abstract: The low-rank matrix recovery problem arises in various fields, including signal processing, machine learning, and imaging science. The Riemannian gradient descent (RGD) algorithm has proven to be efficient for solving this problem. In this paper, we present a preconditioned Riemannian gradient descent (PRGD) algorithm for low-rank matrix recovery. The preconditioner, notable for its simplicity and computational efficiency, is constructed by weighting the (i,j)-th entry of the gradient matrix according to the norms of the i-th row and the j-th column. We establish a theoretical recovery guarantee for PRGD under the restricted isometry property assumption. Experimental results indicate that PRGD can accelerate RGD by up to a factor of ten on low-rank matrix recovery problems such as matrix completion.
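The abstract describes the preconditioner only qualitatively: each (i,j)-th entry of the gradient is reweighted using the norms of its i-th row and j-th column. A minimal NumPy sketch of one such entrywise reweighting is given below; the specific weight formula `w_ij = 1/sqrt((||row_i||^2 + delta)(||col_j||^2 + delta))` and the damping parameter `delta` are illustrative assumptions, not the exact construction from the paper.

```python
import numpy as np

def precondition_gradient(G, delta=1e-8):
    """Entrywise reweighting of a gradient matrix G.

    Illustrative sketch (not the paper's exact formula): the (i, j)-th
    entry of G is scaled by a weight built from the squared norm of
    G's i-th row and j-th column. `delta` guards against division by
    zero when a row or column vanishes.
    """
    row_norms = np.sum(G**2, axis=1)  # shape (m,): squared row norms
    col_norms = np.sum(G**2, axis=0)  # shape (n,): squared column norms
    # w_ij = 1 / sqrt((||row_i||^2 + delta) * (||col_j||^2 + delta))
    weights = 1.0 / np.sqrt(np.outer(row_norms + delta, col_norms + delta))
    return weights * G
```

Because the weights are positive and depend only on row/column magnitudes, the preconditioned gradient keeps the sign pattern of the original while rebalancing rows and columns of very different scales, which is what makes the operation cheap: it costs only O(mn), the same order as forming the gradient itself.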

This page was built for publication: A Preconditioned Riemannian Gradient Descent Algorithm for Low-Rank Matrix Recovery
