A new convergence proof for the higher-order power method and generalizations

From MaRDI portal

Publication:2941672

zbMath: 1339.65054
arXiv: 1407.4586
MaRDI QID: Q2941672

André Uschmajew

Publication date: 21 August 2015

Full work available at URL: https://arxiv.org/abs/1407.4586




Related Items (26)

Convergence rate analysis for the higher order power method in best rank one approximations of tensors
Robust Eigenvectors of Symmetric Tensors
Numerical approximation of Poisson problems in long domains
On the spectral problem for trivariate functions
The Optimization Landscape for Fitting a Rank-2 Tensor with a Rank-1 Tensor
Tensor Canonical Correlation Analysis With Convergence and Statistical Guarantees
Alternating Least Squares as Moving Subspace Correction
Convergence Analysis on SS-HOPM for BEC-Like Nonlinear Eigenvalue Problems
Linear convergence of an alternating polar decomposition method for low rank orthogonal tensor approximations
Jacobi-type algorithms for homogeneous polynomial optimization on Stiefel manifolds with applications to tensor approximations
Convergence of Gradient-Based Block Coordinate Descent Algorithms for Nonorthogonal Joint Approximate Diagonalization of Matrices
Unnamed Item
Quantifying measurement-induced disturbance to distinguish correlations as classical or quantum
On global convergence of alternating least squares for tensor approximation
Approximate Matrix and Tensor Diagonalization by Unitary Transformations: Convergence of Jacobi-Type Algorithms
The Epsilon-Alternating Least Squares for Orthogonal Low-Rank Tensor Approximation and Its Global Convergence
Tensor networks and hierarchical tensors for the solution of high-dimensional partial differential equations
Finding a low-rank basis in a matrix subspace
Greedy low-rank approximation in Tucker format of solutions of tensor linear systems
On the convergence of higher-order orthogonal iteration
The point-wise convergence of shifted symmetric higher order power method
Low-Rank Approximation and Completion of Positive Tensors
Quadratic optimization with orthogonality constraint: explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods
Unnamed Item
Unnamed Item
Globally Convergent Jacobi-Type Algorithms for Simultaneous Orthogonal Symmetric Tensor Diagonalization
