Gauss-Southwell type descent methods for low-rank matrix optimization
From MaRDI portal
Publication: Q6438899
arXiv: 2306.00897
MaRDI QID: Q6438899
Authors: Guillaume Olikier, André Uschmajew, Bart Vandereycken
Publication date: 1 June 2023
Abstract: We consider gradient-related methods for low-rank matrix optimization with a smooth cost function. The methods operate on single factors of the low-rank factorization and share aspects of both alternating and Riemannian optimization. Two possible choices for the search directions based on Gauss-Southwell type selection rules are compared: one using the gradient of a factorized non-convex formulation, the other using the Riemannian gradient. While both methods provide gradient convergence guarantees similar to the unconstrained case, the version based on the Riemannian gradient is significantly more robust with respect to small singular values and the condition number of the cost function, as illustrated by numerical experiments. As a side result of our approach, we also obtain new convergence results for the alternating least squares method.
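To make the factorized variant concrete, the following is a minimal sketch (not the authors' exact algorithm) of a Gauss-Southwell-type step on a factorization X = L Rᵀ: both partial gradients are computed, and only the factor with the larger partial gradient (in Frobenius norm) is updated. The function names, the step size, and the quadratic test cost f(X) = ½‖X − A‖²_F are illustrative assumptions.

```python
import numpy as np

def gs_factored_step(L, R, grad_f, alpha=0.01):
    """One Gauss-Southwell-type step on X = L @ R.T (illustrative sketch).

    grad_f(X) returns the Euclidean gradient of the cost at X.
    The factor (L or R) with the larger partial gradient in Frobenius
    norm is selected, and only that factor is updated.
    """
    G = grad_f(L @ R.T)   # gradient of f at X = L R^T
    GL = G @ R            # partial gradient with respect to L
    GR = G.T @ L          # partial gradient with respect to R
    if np.linalg.norm(GL) >= np.linalg.norm(GR):
        L = L - alpha * GL
    else:
        R = R - alpha * GR
    return L, R

# Illustrative cost: f(X) = 0.5 * ||X - A||_F^2, so grad f(X) = X - A,
# with A an exactly rank-r target matrix (an assumed toy setup).
rng = np.random.default_rng(0)
m, n, r = 20, 15, 3
A = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
L = rng.standard_normal((m, r))
R = rng.standard_normal((n, r))
grad = lambda X: X - A

err0 = np.linalg.norm(L @ R.T - A)
for _ in range(2000):
    L, R = gs_factored_step(L, R, grad, alpha=0.01)
err1 = np.linalg.norm(L @ R.T - A)
```

The abstract's observation about robustness corresponds to the fact that the partial gradients GL and GR are scaled by the other factor, so ill-conditioned or nearly singular factors distort these directions; the Riemannian-gradient variant avoids this scaling.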