Riemannian preconditioned coordinate descent for low multi-linear rank approximation
From MaRDI portal
Publication: Q6504649
arXiv: 2109.01632
MaRDI QID: Q6504649
FDO: Q6504649
Authors: Mohammad Abu Hamed, Reshad Hosseini
Abstract: This paper presents a fast, memory-efficient, optimization-based, first-order method for low multi-linear rank approximation of high-order, high-dimensional tensors. In our method, we exploit the second-order information of the cost function and the constraints to suggest a new Riemannian metric on the Grassmann manifold. We use a Riemannian coordinate descent method for solving the problem, and also provide a local convergence analysis matching that of the coordinate descent method in the Euclidean setting. We also show that each step of our method with unit step size is actually a step of the orthogonal iteration algorithm. Experimental results show the computational advantage of our method for high-dimensional tensors.
Has companion code repository: https://github.com/utvisionlab/rpcd
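The abstract notes that each unit-step iteration of the method coincides with a step of the orthogonal iteration algorithm. As background, here is a minimal NumPy sketch of the classical higher-order orthogonal iteration (HOOI) for low multi-linear rank approximation; it is not the authors' Riemannian preconditioned coordinate descent method (for that, see the linked repository), and the function names are illustrative only.

```python
import numpy as np

def unfold(T, mode):
    # Mode-n unfolding: move axis `mode` to the front, flatten the rest.
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def hooi(T, ranks, n_iter=50):
    # Higher-order orthogonal iteration for a low multi-linear rank
    # approximation T ≈ G ×_1 U[0] ×_2 U[1] ... (Tucker format).
    # Initialize factors with truncated SVDs of the unfoldings (HOSVD).
    U = [np.linalg.svd(unfold(T, n), full_matrices=False)[0][:, :r]
         for n, r in enumerate(ranks)]
    for _ in range(n_iter):
        for n in range(T.ndim):
            # Contract T with the transposes of all other factors, then
            # take the leading left singular vectors of the mode-n unfolding.
            Y = T
            for m in range(T.ndim):
                if m != n:
                    Y = np.moveaxis(
                        np.tensordot(U[m].T, Y, axes=(1, m)), 0, m)
            U[n] = np.linalg.svd(unfold(Y, n),
                                 full_matrices=False)[0][:, :ranks[n]]
    # Core tensor: T contracted with all factor transposes.
    G = T
    for m in range(T.ndim):
        G = np.moveaxis(np.tensordot(U[m].T, G, axes=(1, m)), 0, m)
    return G, U
```

For a tensor whose multi-linear rank matches `ranks`, the reconstruction `G ×_1 U[0] ×_2 U[1] ×_3 U[2]` recovers the tensor to machine precision; otherwise HOOI converges to a locally optimal low multi-linear rank approximation.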
MSC classifications:
- 15A69 Multilinear algebra, tensor calculus
- 90C26 Nonconvex programming, global optimization
- 58D17 Manifolds of metrics (especially Riemannian)