A Newton–Grassmann Method for Computing the Best Multilinear Rank-$(r_1,$ $r_2,$ $r_3)$ Approximation of a Tensor


DOI: 10.1137/070688316 · zbMath: 1205.65161 · OpenAlex: W2036718271 · MaRDI QID: Q3561155

Berkant Savas, Lars Eldén

Publication date: 25 May 2010

Published in: SIAM Journal on Matrix Analysis and Applications

Full work available at URL: https://doi.org/10.1137/070688316
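The paper computes the best multilinear rank-$(r_1, r_2, r_3)$ approximation of a tensor via a Newton method on a product of Grassmann manifolds. As a point of reference, the same approximation problem is often attacked with the simpler higher-order orthogonal iteration (HOOI), an alternating scheme that appears among the related items below. The following is a minimal NumPy sketch of HOOI, not the Newton–Grassmann algorithm of the paper; all function and variable names here are illustrative.

```python
import numpy as np

def hooi(X, ranks, n_iter=50):
    """Higher-Order Orthogonal Iteration (HOOI) for a 3-way tensor X:
    an alternating scheme for the best multilinear rank-(r1, r2, r3)
    approximation. Illustrative sketch, not the paper's Newton method."""
    # Initialize factors from the HOSVD: leading left singular
    # vectors of each mode-n unfolding.
    U = []
    for n, r in enumerate(ranks):
        Xn = np.moveaxis(X, n, 0).reshape(X.shape[n], -1)  # mode-n unfolding
        u, _, _ = np.linalg.svd(Xn, full_matrices=False)
        U.append(u[:, :r])

    def mode_mult(T, M, m):
        # Mode-m product: contract M (shape: new_dim x T.shape[m]) with T along mode m.
        return np.moveaxis(np.tensordot(M, np.moveaxis(T, m, 0), axes=1), 0, m)

    for _ in range(n_iter):
        for n in range(3):
            # Project all modes except n, then refresh U[n] from the unfolding.
            Y = X
            for m in range(3):
                if m != n:
                    Y = mode_mult(Y, U[m].T, m)
            Yn = np.moveaxis(Y, n, 0).reshape(X.shape[n], -1)
            u, _, _ = np.linalg.svd(Yn, full_matrices=False)
            U[n] = u[:, :ranks[n]]

    # Core tensor and low-multilinear-rank reconstruction.
    G = X
    for m in range(3):
        G = mode_mult(G, U[m].T, m)
    Xhat = G
    for m in range(3):
        Xhat = mode_mult(Xhat, U[m], m)
    return U, G, Xhat
```

On a tensor of exact multilinear rank $(r_1, r_2, r_3)$, the HOSVD initialization already spans the correct subspaces, so the reconstruction is exact up to rounding; on general tensors HOOI converges only to a stationary point, which is the gap the paper's Newton–Grassmann method addresses with local quadratic convergence.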





Related Items (39)

Efficient alternating least squares algorithms for low multilinear rank approximation of tensors
Linear algebra for tensor problems
Randomized algorithms for the approximations of Tucker and the tensor train decompositions
Optimization landscape of Tucker decomposition
Topology of tensor ranks
Krylov-type methods for tensor computations. I
Jacobi-type algorithms for homogeneous polynomial optimization on Stiefel manifolds with applications to tensor approximations
Randomized algorithms for the computation of multilinear rank-\((\mu_1,\mu_2,\mu_3)\) approximations
Some norm inequalities for commutators of contracted tensor products
A gradual rank increasing process for matrix completion
Tensor Approximation for Multidimensional and Multivariate Data
A tensor decomposition approach to data compression and approximation of ND systems
Minimality of tensors of fixed multilinear rank
Variational calculus with sums of elementary tensors of fixed rank
Riemannian Modified Polak--Ribière--Polyak Conjugate Gradient Order Reduced Model by Tensor Techniques
The Computation of Low Multilinear Rank Approximations of Tensors via Power Scheme and Random Projection
On manifolds of tensors of fixed TT-rank
Randomized algorithms for the low multilinear rank approximations of tensors
On optimal low rank Tucker approximation for tensors: the case for an adjustable core size
Greedy low-rank approximation in Tucker format of solutions of tensor linear systems
Classification of hyperspectral images by tensor modeling and additive morphological decomposition
Diagonalization of tensors with circulant structure
Fast Hankel tensor–vector product and its application to exponential data fitting
On polynomial time methods for exact low-rank tensor completion
Gradient flows for optimization in quantum information and quantum dynamics: foundations and applications
Numerical tensor calculus
On the convergence of higher-order orthogonal iteration
Nonlinearly Preconditioned Optimization on Grassmann Manifolds for Computing Approximate Tucker Tensor Decompositions
An efficient randomized algorithm for computing the approximate Tucker decomposition
Tensor neural network models for tensor singular value decompositions
Even-order Toeplitz tensor: framework for multidimensional structured linear systems
A Krylov-Schur-like method for computing the best rank-\((r_1,r_2,r_3)\) approximation of large and sparse tensors
Unnamed Item
Computing the Gradient in Optimization Algorithms for the CP Decomposition in Constant Memory through Tensor Blocking
Frobenius norm inequalities of commutators based on different products
A literature survey of low-rank tensor approximation techniques
Classification of sub-Cuntz states
Optimization on the hierarchical Tucker manifold - applications to tensor completion
ISLET: Fast and Optimal Low-Rank Tensor Regression via Importance Sketching
