Low-rank approximation of tensors via sparse optimization.

From MaRDI portal
Publication: 4637405

DOI: 10.1002/NLA.2136 · zbMATH Open: 1499.65247 · arXiv: 1504.05273 · OpenAlex: W2963431840 · Wikidata: Q114235423 · MaRDI QID: Q4637405


Authors: Xiao Fei Wang, C. Navasca


Publication date: 18 April 2018

Published in: Numerical Linear Algebra with Applications

Abstract: The goal of this paper is to find a low-rank approximation of a given tensor. Specifically, we give a computable strategy for estimating the rank of a given tensor, based on approximating the solution to an NP-hard problem. We formulate a sparse optimization problem via l1-regularization to find a low-rank approximation of tensors. To solve this sparse optimization problem, we propose a rescaled version of the proximal alternating minimization algorithm and study its theoretical convergence. Furthermore, we discuss the probabilistic consistency of the sparsity result and suggest a way to choose the regularization parameter for practical computation. In simulation experiments, the performance of our algorithm supports the claim that our method provides an efficient estimate of the number of rank-one components in a given tensor. Moreover, the algorithm is also applied to low-rank approximation of surveillance videos.


Full work available at URL: https://arxiv.org/abs/1504.05273
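The approach described in the abstract — approximating a tensor as a sum of rank-one terms while an l1 penalty on the component weights drives redundant terms to zero — can be sketched roughly as follows. This is an illustrative NumPy sketch, not the authors' implementation: the function names, the alternating least-squares factor updates, the single proximal-gradient (soft-thresholding) step on the weights, and all parameter defaults are assumptions made for the example.

```python
import numpy as np

def soft_threshold(x, tau):
    # Proximal operator of tau * ||.||_1: shrink each entry toward zero.
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def l1_cp_approx(T, R=8, gamma=0.05, iters=100, seed=0):
    """Illustrative sketch (not the paper's code): approximate a 3-way tensor
    T by sum_r lam[r] * a_r (x) b_r (x) c_r. Factors are updated by least
    squares on each mode unfolding; the weight vector lam gets one
    proximal-gradient (soft-thresholding) step for the l1 penalty per sweep.
    Near-zero entries of lam then estimate the number of rank-one terms."""
    rng = np.random.default_rng(seed)
    I, J, K = T.shape
    A = rng.standard_normal((I, R)); A /= np.linalg.norm(A, axis=0)
    B = rng.standard_normal((J, R)); B /= np.linalg.norm(B, axis=0)
    C = rng.standard_normal((K, R)); C /= np.linalg.norm(C, axis=0)
    lam = np.ones(R)
    t = T.reshape(-1)
    for _ in range(iters):
        # Factor updates: least squares against each mode unfolding.
        for mode in range(3):
            if mode == 0:
                unf, M1, M2 = T.reshape(I, -1), B, C
            elif mode == 1:
                unf, M1, M2 = T.transpose(1, 0, 2).reshape(J, -1), A, C
            else:
                unf, M1, M2 = T.transpose(2, 0, 1).reshape(K, -1), A, B
            # Khatri-Rao-style design matrix, scaled by the current weights.
            KR = np.einsum('ir,jr->ijr', M1, M2).reshape(-1, R) * lam
            F, *_ = np.linalg.lstsq(KR, unf.T, rcond=None)
            F = F.T
            # Rescale: columns to unit norm, scales absorbed into lam.
            n = np.linalg.norm(F, axis=0); n[n == 0] = 1.0
            F = F / n
            lam = lam * n
            if mode == 0:   A = F
            elif mode == 1: B = F
            else:           C = F
        # Weight update: one proximal-gradient step on
        # 0.5*||Z lam - t||^2 + gamma*||lam||_1.
        Z = np.einsum('ir,jr,kr->ijkr', A, B, C).reshape(-1, R)
        G, g = Z.T @ Z, Z.T @ t
        L = np.linalg.norm(G, 2) + 1e-12   # Lipschitz constant of the gradient
        lam = soft_threshold(lam - (G @ lam - g) / L, gamma / L)
    return lam, (A, B, C)
```

The sparsity pattern of the returned `lam` plays the role of the rank estimate: components whose weights are shrunk to (near) zero are discarded, and the surviving rank-one terms form the low-rank approximation.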









Cited In (13)





This page was built for publication: Low-rank approximation of tensors via sparse optimization.
