Practical approximation algorithms for \(\ell_1\)-regularized sparse rank-1 approximation to higher-order tensors

From MaRDI portal
Publication: Q6124341

DOI: 10.1007/S11590-023-02032-6
arXiv: 2207.07383
OpenAlex: W4382404638
MaRDI QID: Q6124341
FDO: Q6124341


Authors: Xianpeng Mao, Yuning Yang


Publication date: 27 March 2024

Published in: Optimization Letters

Abstract: Two approximation algorithms are proposed for \(\ell_1\)-regularized sparse rank-1 approximation to higher-order tensors. The algorithms are based on multilinear relaxation and sparsification; they are easy to implement and scale well. In particular, the second one scales linearly with the size of the input tensor. Based on a careful estimation of the \(\ell_1\)-regularized sparsification, theoretical approximation lower bounds are derived. Our theoretical results also suggest an explicit way of choosing the regularization parameters. Numerical examples are provided to verify the proposed algorithms.


Full work available at URL: https://arxiv.org/abs/2207.07383
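To make the abstract's idea concrete, here is a minimal NumPy sketch of one generic way to combine a rank-1 power-type update with \(\ell_1\) shrinkage on a third-order tensor. This is a hypothetical illustration, not the paper's algorithms: the alternating scheme, the soft-thresholding step, and the HOSVD-style initialization are all assumptions introduced here.

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of the l1 norm: shrink each entry toward zero by tau.
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def sparse_rank1_approx(T, tau=0.1, iters=50):
    # Hypothetical sketch (not the paper's method): alternate power-type
    # updates over the three modes, each followed by l1 shrinkage and
    # renormalization, to get sparse factors x, y, z with weight sigma.
    n1, n2, n3 = T.shape
    # HOSVD-style init: leading left singular vector of each unfolding.
    x = np.linalg.svd(T.reshape(n1, -1))[0][:, 0]
    y = np.linalg.svd(np.moveaxis(T, 1, 0).reshape(n2, -1))[0][:, 0]
    z = np.linalg.svd(np.moveaxis(T, 2, 0).reshape(n3, -1))[0][:, 0]
    for _ in range(iters):
        x = soft_threshold(np.einsum('ijk,j,k->i', T, y, z), tau)
        x /= max(np.linalg.norm(x), 1e-12)
        y = soft_threshold(np.einsum('ijk,i,k->j', T, x, z), tau)
        y /= max(np.linalg.norm(y), 1e-12)
        z = soft_threshold(np.einsum('ijk,i,j->k', T, x, y), tau)
        z /= max(np.linalg.norm(z), 1e-12)
    sigma = np.einsum('ijk,i,j,k->', T, x, y, z)  # rank-1 weight
    return sigma, x, y, z
```

Larger values of the regularization parameter `tau` drive more entries of the factors to zero; the paper's contribution includes a principled way to choose such parameters, which this sketch does not attempt.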














