Greedy approaches to symmetric orthogonal tensor decomposition
From MaRDI portal
Publication:4588941
Abstract: Finding the symmetric and orthogonal decomposition (SOD) of a tensor is a recurring problem in signal processing, machine learning and statistics. In this paper, we review, establish and compare the perturbation bounds for two natural types of incremental rank-one approximation approaches. Numerical experiments and open questions are also presented and discussed.
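As a rough illustration of the incremental rank-one idea the abstract refers to (a generic sketch, not the paper's exact algorithms), the following NumPy snippet greedily peels rank-one terms off an orthogonally decomposable symmetric order-3 tensor: power iteration finds a dominant eigenvector, and deflation subtracts the recovered component. The planted tensor, function names, and tolerances are all illustrative assumptions.

```python
import numpy as np

def tensor_apply(T, u):
    # Contract an order-3 symmetric tensor with u in two modes: T(I, u, u).
    return np.einsum('ijk,j,k->i', T, u, u)

def rank_one(u):
    # Symmetric rank-one tensor u (outer) u (outer) u.
    return np.einsum('i,j,k->ijk', u, u, u)

def greedy_sod(T, r, n_iter=200, seed=0):
    # Greedy successive rank-one approximation (illustrative sketch):
    # power iteration for an eigenpair, then deflate and repeat.
    rng = np.random.default_rng(seed)
    T = T.copy()
    lams, vecs = [], []
    for _ in range(r):
        u = rng.standard_normal(T.shape[0])
        u /= np.linalg.norm(u)
        for _ in range(n_iter):
            w = tensor_apply(T, u)
            nw = np.linalg.norm(w)
            if nw < 1e-12:
                break
            u = w / nw
        lam = float(np.einsum('ijk,i,j,k->', T, u, u, u))
        lams.append(lam)
        vecs.append(u)
        T = T - lam * rank_one(u)  # deflate the recovered component
    return np.array(lams), np.array(vecs)

# Demo on a planted orthogonal decomposition: 3*e1^{(x)3} + 1*e2^{(x)3}.
e1, e2 = np.eye(3)[0], np.eye(3)[1]
T = 3.0 * rank_one(e1) + 1.0 * rank_one(e2)
lams, vecs = greedy_sod(T, 2)
```

For an exactly orthogonally decomposable input like this demo, the greedy scheme recovers the planted eigenvalues and mutually orthogonal eigenvectors; the paper's perturbation bounds concern how such recovery degrades when the tensor is only nearly decomposable.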
Recommendations
- Successive Rank-One Approximations for Nearly Orthogonally Decomposable Symmetric Tensors
- Symmetric rank-1 approximation of symmetric high-order tensors
- On the perturbation of rank-one symmetric tensors
- Symmetric tensor decomposition
- Monotonically convergent algorithms for symmetric tensor approximation
Cites work
- scientific article; zbMATH DE number 45789
- scientific article; zbMATH DE number 1489808
- A note on semidefinite programming relaxations for polynomial optimization over a single sphere
- A sequential subspace projection method for extreme Z-eigenvalues of supersymmetric tensors
- An approach to obtaining global extremums in polynomial mathematical programming problems
- An unconstrained optimization approach for finding real eigenvalues of even order symmetric tensors
- Global optimization with polynomials and the problem of moments
- GloptiPoly 3: moments, optimization and semidefinite programming
- Independent component analysis, a new concept?
- Maximum block improvement and polynomial optimization
- Numerical Optimization
- On the Best Rank-1 and Rank-(R1, R2, ..., RN) Approximation of Higher-Order Tensors
- On the best rank-1 approximation of higher-order supersymmetric tensors
- On the successive supersymmetric rank-1 decomposition of higher-order supersymmetric tensors
- Orthogonal tensor decompositions
- Properties and methods for finding the best rank-one approximation to higher-order tensors
- Rank-one approximation to high order tensors
- Semidefinite programming relaxations for semialgebraic problems
- Semidefinite relaxations for best rank-1 tensor approximations
- Shifted power method for computing tensor eigenpairs
- Successive Rank-One Approximations for Nearly Orthogonally Decomposable Symmetric Tensors
- Tensor decompositions for learning latent variable models
- The Rotation of Eigenvectors by a Perturbation. III
- The best rank-1 approximation of a symmetric tensor and related spherical optimization problems
- The elements of statistical learning. Data mining, inference, and prediction
Cited in (4)
- scientific article; zbMATH DE number 7415122
- Successive partial-symmetric rank-one algorithms for almost unitarily decomposable conjugate partial-symmetric tensors
- Using negative curvature in solving nonlinear programs
- Gradient Descent for Symmetric Tensor Decomposition