On global convergence of alternating least squares for tensor approximation
DOI: 10.1007/s10589-022-00428-1 · OpenAlex: W4307649553 · MaRDI QID: Q2696915
Publication date: 17 April 2023
Published in: Computational Optimization and Applications
Full work available at URL: https://doi.org/10.1007/s10589-022-00428-1
Keywords: tensor; global convergence; block coordinate descent; alternating least squares; canonical polyadic decomposition
MSC classifications: Nonconvex programming, global optimization (90C26); Stability and convergence of numerical methods for boundary value problems involving PDEs (65N12); Rate of convergence, degree of approximation (41A25); Multilinear algebra, tensor calculus (15A69); Mathematical programming (90Cxx)
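For orientation, the keywords point at the canonical polyadic (CP) decomposition computed by alternating least squares (ALS), the block coordinate descent iteration whose global convergence the paper analyzes. Below is a minimal illustrative sketch of the textbook ALS iteration for a third-order tensor in NumPy; the function names, unfolding conventions, and stopping rule are our own generic choices, not taken from the paper.

```python
import numpy as np

def khatri_rao(U, V):
    """Column-wise Khatri-Rao product: (m x R), (n x R) -> (m*n x R)."""
    m, R = U.shape
    n, _ = V.shape
    return (U[:, None, :] * V[None, :, :]).reshape(m * n, R)

def cp_als(T, rank, n_iter=200, seed=0):
    """Rank-`rank` CP approximation of a 3-way tensor via alternating least squares.

    Each sweep fixes two factor matrices and solves a linear least-squares
    problem for the third; this is one block coordinate descent cycle.
    """
    I, J, K = T.shape
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((I, rank))
    B = rng.standard_normal((J, rank))
    C = rng.standard_normal((K, rank))
    T1 = T.reshape(I, -1)                     # mode-1 unfolding: T1 = A (B ⊙ C)^T
    T2 = T.transpose(1, 0, 2).reshape(J, -1)  # mode-2 unfolding
    T3 = T.transpose(2, 0, 1).reshape(K, -1)  # mode-3 unfolding
    for _ in range(n_iter):
        A = T1 @ khatri_rao(B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
        B = T2 @ khatri_rao(A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
        C = T3 @ khatri_rao(A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
    return A, B, C

# Usage: recover a synthetic rank-3 tensor.
rng = np.random.default_rng(1)
A0, B0, C0 = (rng.standard_normal((d, 3)) for d in (5, 6, 7))
T = np.einsum('ir,jr,kr->ijk', A0, B0, C0)
A, B, C = cp_als(T, rank=3)
approx = np.einsum('ir,jr,kr->ijk', A, B, C)
print(np.linalg.norm(T - approx) / np.linalg.norm(T))  # relative residual, near 0
```

Each normal-equations solve uses the identity (B ⊙ C)^T (B ⊙ C) = (B^T B) * (C^T C), with * the elementwise (Hadamard) product, which is why only small R x R systems appear; the pseudoinverse guards against rank-deficient factors early in the iteration.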
Related Items (1)
Cites Work
- Tensor Decompositions and Applications
- Three-way arrays: rank and uniqueness of trilinear decompositions, with application to arithmetic complexity and statistics
- Some convergence results on the regularized alternating least-squares method for tensor decomposition
- Musings on multilinear fitting
- Convergence of descent methods for semi-algebraic and tame problems: proximal algorithms, forward-backward splitting, and regularized Gauss-Seidel methods
- On the convergence of the block nonlinear Gauss-Seidel method under convex constraints
- Rank-One Approximation to High Order Tensors
- On Local Convergence of Alternating Schemes for Optimization of Convex Problems in the Tensor Train Format
- Optimization-Based Algorithms for Tensor Decompositions: Canonical Polyadic Decomposition, Decomposition in Rank-$(L_r,L_r,1)$ Terms, and a New Generalization
- A Block Coordinate Descent Method for Regularized Multiconvex Optimization with Applications to Nonnegative Tensor Factorization and Completion
- Local Convergence of the Alternating Least Squares Algorithm for Canonical Tensor Approximation
- On the Global Convergence of the Alternating Least Squares Method for Rank-One Approximation to Generic Tensors
- A new convergence proof for the higher-order power method and generalizations
- Shifted Power Method for Computing Tensor Eigenpairs
- Orthogonal Low Rank Tensor Approximation: Alternating Least Squares Method and Its Global Convergence
- Alternating Least Squares as Moving Subspace Correction
- New ALS Methods With Extrapolating Search Directions and Optimal Step Size for Complex-Valued Tensor Decompositions
- Tensor Decomposition for Signal Processing and Machine Learning
- On the convergence of higher-order orthogonal iteration
- Quasi-Newton Methods on Grassmannians and Multilinear Approximations of Tensors
- Low Complexity Damped Gauss--Newton Algorithms for CANDECOMP/PARAFAC
- Tensor Rank and the Ill-Posedness of the Best Low-Rank Approximation Problem
- Enhanced Line Search: A Novel Method to Accelerate PARAFAC
- Minimizing Certain Convex Functions
- Convergence of the Iterates of Descent Methods for Analytic Cost Functions
- On search directions for minimization algorithms
- Convergence of a block coordinate descent method for nondifferentiable minimization
- Exact line and plane search for tensor optimization