Low Complexity Damped Gauss--Newton Algorithms for CANDECOMP/PARAFAC


DOI: 10.1137/100808034
zbMath: 1365.65071
arXiv: 1205.2584
OpenAlex: W2066392792
Wikidata: Q60486587 (Scholia: Q60486587)
MaRDI QID: Q5300549

Anh-Huy Phan, Petr Tichavský, Andrzej Cichocki

Publication date: 27 June 2013

Published in: SIAM Journal on Matrix Analysis and Applications

Full work available at URL: https://arxiv.org/abs/1205.2584
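The record itself carries no technical detail, but for orientation the sketch below illustrates a generic damped Gauss--Newton (Levenberg-Marquardt) iteration for fitting a rank-R CP model to a third-order tensor. It forms the full Jacobian explicitly, so it does not reproduce the low-complexity structure that is the paper's actual contribution; all function names, tensor sizes, and the damping schedule are assumptions made purely for illustration.

```python
# Illustrative damped Gauss-Newton (Levenberg-Marquardt) step for a rank-R CP model.
# Naive: builds the full Jacobian, unlike the low-complexity scheme studied in the paper.
import numpy as np

def cp_reconstruct(A, B, C):
    """Rebuild the I x J x K tensor from CP factors A (IxR), B (JxR), C (KxR)."""
    return np.einsum('ir,jr,kr->ijk', A, B, C)

def cp_jacobian(A, B, C):
    """Jacobian of vec(model) w.r.t. [vec(A); vec(B); vec(C)] (row-major ordering)."""
    I, R = A.shape
    J, K = B.shape[0], C.shape[0]
    cols = []
    for i in range(I):
        for r in range(R):
            G = np.zeros((I, J, K))
            G[i] = np.outer(B[:, r], C[:, r])        # d model / d A[i, r]
            cols.append(G.ravel())
    for j in range(J):
        for r in range(R):
            G = np.zeros((I, J, K))
            G[:, j, :] = np.outer(A[:, r], C[:, r])  # d model / d B[j, r]
            cols.append(G.ravel())
    for k in range(K):
        for r in range(R):
            G = np.zeros((I, J, K))
            G[:, :, k] = np.outer(A[:, r], B[:, r])  # d model / d C[k, r]
            cols.append(G.ravel())
    return np.column_stack(cols)

def damped_gn_cp(X, R, iters=50, mu=1.0, seed=0):
    """Fit a rank-R CP model to X by damped Gauss-Newton with a simple mu schedule."""
    rng = np.random.default_rng(seed)
    I, J, K = X.shape
    A = rng.standard_normal((I, R))
    B = rng.standard_normal((J, R))
    C = rng.standard_normal((K, R))
    err = np.linalg.norm(X - cp_reconstruct(A, B, C))
    for _ in range(iters):
        r_vec = (X - cp_reconstruct(A, B, C)).ravel()
        Jm = cp_jacobian(A, B, C)
        H, g = Jm.T @ Jm, Jm.T @ r_vec
        step = np.linalg.solve(H + mu * np.eye(H.shape[0]), g)   # damped normal equations
        dA = step[:I * R].reshape(I, R)
        dB = step[I * R:(I + J) * R].reshape(J, R)
        dC = step[(I + J) * R:].reshape(K, R)
        new_err = np.linalg.norm(X - cp_reconstruct(A + dA, B + dB, C + dC))
        if new_err < err:      # accept the step and relax the damping
            A, B, C, err, mu = A + dA, B + dB, C + dC, new_err, mu * 0.5
        else:                  # reject the step and increase the damping
            mu *= 2.0
    return A, B, C, err

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A0, B0, C0 = rng.standard_normal((5, 3)), rng.standard_normal((6, 3)), rng.standard_normal((4, 3))
    X = cp_reconstruct(A0, B0, C0)     # exact rank-3 tensor for a quick sanity check
    _, _, _, final_err = damped_gn_cp(X, R=3)
    print("final fit error:", final_err)
```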




Related Items

Tensor decomposition for learning Gaussian mixtures from moments
A seminorm regularized alternating least squares algorithm for canonical tensor decomposition
The Optimization Landscape for Fitting a Rank-2 Tensor with a Rank-1 Tensor
Rank-1 Tensor Properties with Applications to a Class of Tensor Optimization Problems
Condition numbers for the tensor rank decomposition
Alternating Mahalanobis Distance Minimization for Accurate and Well-Conditioned CP Decomposition
A block-randomized stochastic method with importance sampling for CP tensor decomposition
On global convergence of alternating least squares for tensor approximation
A Riemannian Trust Region Method for the Canonical Tensor Rank Approximation Problem
Numerical CP decomposition of some difficult tensors
Generalized Canonical Polyadic Tensor Decomposition
Riemannian Newton optimization methods for the symmetric tensor approximation problem
The Dynamics of Swamps in the Canonical Tensor Approximation Problem
Alternate algorithms to most referenced techniques of numerical optimization to solve the symmetric rank-\(R\) approximation problem of symmetric tensors
Comparison of Accuracy and Scalability of Gauss--Newton and Alternating Least Squares for CANDECOMP/PARAFAC Decomposition
On the Uniqueness and Perturbation to the Best Rank-One Approximation of a Tensor
Computing the Gradient in Optimization Algorithms for the CP Decomposition in Constant Memory through Tensor Blocking
A literature survey of low-rank tensor approximation techniques