Best nonnegative rank-one approximations of tensors

From MaRDI portal
Publication:5203971

DOI: 10.1137/18M1224064
zbMATH Open: 1454.90047
arXiv: 1810.13372
OpenAlex: W2992276872
Wikidata: Q114074276
Scholia: Q114074276
MaRDI QID: Q5203971
FDO: Q5203971


Authors: Defeng Sun, Kim-Chuan Toh, Shenglong Hu


Publication date: 9 December 2019

Published in: SIAM Journal on Matrix Analysis and Applications

Abstract: In this paper, we study the polynomial optimization problem of multi-forms over the intersection of the multi-spheres and the nonnegative orthants. This class of problems is NP-hard in general, and it includes the problem of finding the best nonnegative rank-one approximation of a given tensor. A Positivstellensatz is given for this class of polynomial optimization problems, based on which a globally convergent hierarchy of doubly nonnegative (DNN) relaxations is proposed. A (zeroth-order) DNN relaxation method is applied to solve these problems, resulting in linear matrix optimization problems under both positive semidefinite and nonnegative conic constraints. A worst-case approximation bound is given for this relaxation method. The recent solver SDPNAL+ is then adopted to solve this class of matrix optimization problems. The DNN relaxations are typically tight, and hence the best nonnegative rank-one approximation of a tensor can frequently be recovered. Extensive numerical experiments show that this approach is quite promising.
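For reference, a standard way to state the core problem mentioned in the abstract (the notation below is generic and not taken from the paper itself): given a tensor \(\mathcal{A} \in \mathbb{R}^{n_1 \times \cdots \times n_d}\), the best nonnegative rank-one approximation asks for

\[
\min_{\lambda \ge 0,\ x_k \ge 0,\ \|x_k\|_2 = 1}\ \bigl\|\mathcal{A} - \lambda\, x_1 \otimes x_2 \otimes \cdots \otimes x_d\bigr\|_F^2 ,
\]

which, whenever the optimal value of the associated multi-form is positive, is equivalent to maximizing that multi-form over the intersection of the unit spheres and the nonnegative orthants:

\[
\max\ \mathcal{A}(x_1,\dots,x_d) = \sum_{i_1,\dots,i_d} \mathcal{A}_{i_1 \cdots i_d}\,(x_1)_{i_1}\cdots(x_d)_{i_d}
\quad \text{s.t.}\quad \|x_k\|_2 = 1,\ x_k \ge 0,\ k = 1,\dots,d .
\]

This is the multi-sphere/nonnegative-orthant polynomial optimization problem to which the DNN relaxation hierarchy described in the abstract is applied.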


Full work available at URL: https://arxiv.org/abs/1810.13372




Recommendations




Cites Work


Cited In (15)

Uses Software





This page was built for publication: Best nonnegative rank-one approximations of tensors
