Local convergence of alternating low‐rank optimization methods with overrelaxation
Publication: 6133038
DOI: 10.1002/nla.2459
arXiv: 2111.14758
OpenAlex: W3214830908
Wikidata: Q112878908
Scholia: Q112878908
MaRDI QID: Q6133038
FDO: Q6133038
Authors: Ivan Oseledets, M. V. Rakhuba, André Uschmajew
Publication date: 17 August 2023
Published in: Numerical Linear Algebra with Applications
Abstract: The local convergence of alternating optimization methods with overrelaxation for low-rank matrix and tensor problems is established. The analysis is based on the linearization of the method, which takes the form of an SOR iteration for a positive semidefinite Hessian and can be studied in the corresponding quotient geometry of equivalent low-rank representations. In the matrix case, the optimal relaxation parameter for accelerating the local convergence can be determined from the convergence rate of the standard method. This result relies on a version of Young's SOR theorem for positive semidefinite block systems.
Full work available at URL: https://arxiv.org/abs/2111.14758
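As an illustration of the iteration structure described in the abstract, the following minimal Python sketch alternates exact least-squares updates of the factors U and V of a rank-r matrix approximation and applies SOR-style overrelaxation with a parameter omega. The Young-type formula relating omega to the rate rho of the standard (omega = 1) method is the classical two-block one, which the paper's result for positive semidefinite block systems parallels. The function names, the test matrix, the rank, and the parameter choices below are illustrative assumptions, not taken from the paper.

# Minimal sketch: alternating low-rank optimization with overrelaxation for
# the rank-r approximation problem min_{U,V} ||A - U V^T||_F^2.
# All names, the rank r, the iteration count, and the test problem are
# illustrative assumptions; this is not the authors' implementation.
import numpy as np

def als_overrelaxed(A, r, omega=1.0, n_iter=200, seed=0):
    """Alternating least squares for A ~ U V^T with SOR-style overrelaxation."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    U = rng.standard_normal((m, r))
    V = rng.standard_normal((n, r))
    for _ in range(n_iter):
        # Exact block minimizer for U with V fixed, then overrelax:
        # U <- (1 - omega) * U + omega * argmin_U ||A - U V^T||_F.
        U_star = A @ V @ np.linalg.pinv(V.T @ V)
        U = (1.0 - omega) * U + omega * U_star
        # Same update for V with U fixed.
        V_star = A.T @ U @ np.linalg.pinv(U.T @ U)
        V = (1.0 - omega) * V + omega * V_star
    return U, V

def omega_from_rate(rho):
    """Young-type choice of the relaxation parameter: if rho in (0, 1) is the
    local convergence rate of the standard method (omega = 1), the classical
    two-block SOR theory gives omega_opt = 2 / (1 + sqrt(1 - rho))."""
    return 2.0 / (1.0 + np.sqrt(1.0 - rho))

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.standard_normal((60, 40))
    r = 6
    # Illustrative relaxation parameter in (0, 2); no claim of optimality here.
    U, V = als_overrelaxed(A, r, omega=1.3)
    s = np.linalg.svd(A, compute_uv=False)
    print("achieved error:", np.linalg.norm(A - U @ V.T))
    print("best rank-r error (SVD):", np.sqrt(np.sum(s[r:] ** 2)))

In the classical consistently ordered setting the omega returned by omega_from_rate is the optimal SOR parameter; the paper relies on a version of Young's theorem for positive semidefinite block systems, so the sketch should be read as an illustration of the iteration structure rather than of the paper's precise result.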
Mathematics Subject Classification: Iterative numerical methods for linear systems (65F10); Numerical methods for low-rank matrix approximation; matrix compression (65F55)
Cited in: 1 document