A new convergence proof for the higher-order power method and generalizations
From MaRDI portal
Publication: 2941672
zbMath: 1339.65054 · arXiv: 1407.4586 · MaRDI QID: Q2941672
Publication date: 21 August 2015
Full work available at URL: https://arxiv.org/abs/1407.4586
Keywords: global convergence; tensors; alternating least squares algorithm; Łojasiewicz inequality; rank-one approximation; higher-order power method
MSC classifications: Numerical computation of eigenvalues and eigenvectors of matrices (65F15); Numerical solutions to overdetermined systems, pseudoinverses (65F20); Numerical mathematical programming methods (65K05); Nonconvex programming, global optimization (90C26); Multilinear algebra, tensor calculus (15A69)
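The keywords above name the higher-order power method, the alternating scheme whose global convergence (via the Łojasiewicz inequality) this publication establishes. A minimal sketch for the best rank-one approximation of a third-order tensor is given below, using NumPy; the function name `hopm`, the random initialization, and the stopping rule are illustrative choices, not taken from the paper.

```python
import numpy as np

def hopm(T, iters=100, tol=1e-10, seed=0):
    """Higher-order power method (sketch): approximate a third-order
    tensor T by a rank-one term lam * (u outer v outer w).

    Each sweep updates one factor by contracting T with the other two
    and renormalizing -- the alternating least squares step for the
    rank-one case."""
    rng = np.random.default_rng(seed)
    u, v, w = (rng.standard_normal(n) for n in T.shape)
    u /= np.linalg.norm(u)
    v /= np.linalg.norm(v)
    w /= np.linalg.norm(w)
    lam = 0.0
    for _ in range(iters):
        # Update each factor in turn, holding the other two fixed.
        u = np.einsum('ijk,j,k->i', T, v, w)
        u /= np.linalg.norm(u)
        v = np.einsum('ijk,i,k->j', T, u, w)
        v /= np.linalg.norm(v)
        w = np.einsum('ijk,i,j->k', T, u, v)
        lam_new = np.linalg.norm(w)
        w /= lam_new
        # Stop when the rank-one weight stabilizes (illustrative criterion).
        if abs(lam_new - lam) < tol:
            break
        lam = lam_new
    return lam, u, v, w
```

For an exactly rank-one input the method recovers the factors in a couple of sweeps; on a general tensor it converges to a stationary point of the best rank-one approximation problem, which is what the paper's Łojasiewicz-based argument addresses.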
Related Items (26, of which 3 are unnamed database entries):
Convergence rate analysis for the higher order power method in best rank one approximations of tensors
Robust Eigenvectors of Symmetric Tensors
Numerical approximation of Poisson problems in long domains
On the spectral problem for trivariate functions
The Optimization Landscape for Fitting a Rank-2 Tensor with a Rank-1 Tensor
Tensor Canonical Correlation Analysis With Convergence and Statistical Guarantees
Alternating Least Squares as Moving Subspace Correction
Convergence Analysis on SS-HOPM for BEC-Like Nonlinear Eigenvalue Problems
Linear convergence of an alternating polar decomposition method for low rank orthogonal tensor approximations
Jacobi-type algorithms for homogeneous polynomial optimization on Stiefel manifolds with applications to tensor approximations
Convergence of Gradient-Based Block Coordinate Descent Algorithms for Nonorthogonal Joint Approximate Diagonalization of Matrices
Quantifying measurement-induced disturbance to distinguish correlations as classical or quantum
On global convergence of alternating least squares for tensor approximation
Approximate Matrix and Tensor Diagonalization by Unitary Transformations: Convergence of Jacobi-Type Algorithms
The Epsilon-Alternating Least Squares for Orthogonal Low-Rank Tensor Approximation and Its Global Convergence
Tensor networks and hierarchical tensors for the solution of high-dimensional partial differential equations
Finding a low-rank basis in a matrix subspace
Greedy low-rank approximation in Tucker format of solutions of tensor linear systems
On the convergence of higher-order orthogonal iteration
The point-wise convergence of shifted symmetric higher order power method
Low-Rank Approximation and Completion of Positive Tensors
Quadratic optimization with orthogonality constraint: explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods
Globally Convergent Jacobi-Type Algorithms for Simultaneous Orthogonal Symmetric Tensor Diagonalization