Local convergence of alternating low‐rank optimization methods with overrelaxation
DOI: 10.1002/NLA.2459
arXiv: 2111.14758
OpenAlex: W3214830908
Wikidata: Q112878908 (Scholia: Q112878908)
MaRDI QID: Q6133038
FDO: Q6133038
Authors: Ivan Oseledets, M. V. Rakhuba, André Uschmajew
Publication date: 17 August 2023
Published in: Numerical Linear Algebra with Applications
Full work available at URL: https://arxiv.org/abs/2111.14758
Recommendations
- Alternating least squares as moving subspace correction
- Local convergence of the alternating least squares algorithm for canonical tensor approximation
- Linear convergence of an alternating polar decomposition method for low rank orthogonal tensor approximations
- Provable accelerated gradient method for nonconvex low rank optimization
- On local convergence of alternating schemes for optimization of convex problems in the tensor train format
Classifications
- Iterative numerical methods for linear systems (65F10)
- Numerical methods for low-rank matrix approximation; matrix compression (65F55)
Cited in 1 document