Successive Rank-One Approximations for Nearly Orthogonally Decomposable Symmetric Tensors
Publication: 3456878
DOI: 10.1137/15M1010890 · zbMath: 1330.15030 · arXiv: 1705.10404 · OpenAlex: W3100283963 · MaRDI QID: Q3456878
Donald Goldfarb, Daniel Hsu, Cun Mu
Publication date: 9 December 2015
Published in: SIAM Journal on Matrix Analysis and Applications
Full work available at URL: https://arxiv.org/abs/1705.10404
Keywords: perturbation analysis; tensor decomposition; orthogonally decomposable tensor; rank-1 tensor approximation
MSC classification:
- Factor analysis and principal components; correspondence analysis (62H25)
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Learning and adaptive systems in artificial intelligence (68T05)
- Signal theory (characterization, reconstruction, filtering, etc.) (94A12)
- Eigenvalues, singular values, and eigenvectors (15A18)
- Multilinear algebra, tensor calculus (15A69)
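The title and keywords of this record describe a greedy, deflation-based approach: repeatedly extract a best rank-1 approximation of a nearly orthogonally decomposable symmetric tensor and subtract it. The sketch below is only an illustration of that general idea (symmetric tensor power iteration with random restarts plus deflation), not the paper's exact algorithm or its perturbation analysis; all function names and parameters are assumptions made for the example.

```python
import numpy as np

def power_iteration(T, n_iter=200, n_restarts=10, seed=None):
    """Approximate a dominant symmetric eigenpair (lam, u) of an order-3 tensor T."""
    rng = np.random.default_rng(seed)
    best_lam, best_u = 0.0, None
    for _ in range(n_restarts):
        u = rng.standard_normal(T.shape[0])
        u /= np.linalg.norm(u)
        for _ in range(n_iter):
            v = np.einsum('ijk,j,k->i', T, u, u)   # T(I, u, u)
            nrm = np.linalg.norm(v)
            if nrm == 0:
                break
            u = v / nrm
        lam = np.einsum('ijk,i,j,k->', T, u, u, u)  # T(u, u, u)
        if best_u is None or abs(lam) > abs(best_lam):
            best_lam, best_u = lam, u
    return best_lam, best_u

def successive_rank_one(T, rank):
    """Greedy deflation: find a rank-1 term, subtract it, repeat."""
    T = T.copy()
    terms = []
    for _ in range(rank):
        lam, u = power_iteration(T)
        terms.append((lam, u))
        T = T - lam * np.einsum('i,j,k->ijk', u, u, u)
    return terms

# Toy input: an orthogonally decomposable tensor plus a small symmetric perturbation.
n, r = 8, 3
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((n, r)))          # r orthonormal columns
weights = [5.0, 3.0, 2.0]
T = sum(w * np.einsum('i,j,k->ijk', q, q, q) for w, q in zip(weights, Q.T))
E = rng.standard_normal((n, n, n))
E = (E + E.transpose(1, 2, 0) + E.transpose(2, 0, 1)
     + E.transpose(0, 2, 1) + E.transpose(1, 0, 2) + E.transpose(2, 1, 0)) / 6
T = T + 1e-2 * E
print([round(float(lam), 2) for lam, _ in successive_rank_one(T, r)])  # ~ [5.0, 3.0, 2.0]
```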
Cites Work
- Tensor Decompositions and Applications
- An unconstrained optimization approach for finding real eigenvalues of even order symmetric tensors
- Multiarray signal processing: tensor decomposition meets compressed sensing
- Three-way arrays: rank and uniqueness of trilinear decompositions, with application to arithmetic complexity and statistics
- Independent component analysis, a new concept?
- Semidefinite programming relaxations for semialgebraic problems
- Tensor principal component analysis via convex optimization
- Properties and methods for finding the best rank-one approximation to higher-order tensors
- Degeneracy in Candecomp/Parafac and Indscal explained for several three-sliced arrays with a two-valued typical rank
- Global Optimization with Polynomials and the Problem of Moments
- Orthogonal Tensor Decompositions
- Rank-One Approximation to High Order Tensors
- On the Best Rank-1 Approximation of Higher-Order Supersymmetric Tensors
- Maximum Block Improvement and Polynomial Optimization
- Tensor decompositions for learning latent variable models
- Semidefinite Relaxations for Best Rank-1 Tensor Approximations
- Dictionary Learning and Tensor Decomposition via the Sum-of-Squares Method
- A sequential subspace projection method for extreme Z-eigenvalues of supersymmetric tensors
- Shifted Power Method for Computing Tensor Eigenpairs
- On the successive supersymmetric rank-1 decomposition of higher-order supersymmetric tensors
- GloptiPoly 3: moments, optimization and semidefinite programming
- Biquadratic Optimization Over Unit Spheres and Semidefinite Programming Relaxations
- An approach to obtaining global extremums in polynomial mathematical programming problems
- Numerical Optimization
- A Counterexample to the Possibility of an Extension of the Eckart--Young Low-Rank Approximation Theorem for the Orthogonal Rank Tensor Decomposition
- On the Best Rank-1 and Rank-($R_1, R_2, \ldots, R_N$) Approximation of Higher-Order Tensors
- The Best Rank-1 Approximation of a Symmetric Tensor and Related Spherical Optimization Problems
- Low-Rank Approximation of Generic $p \times q \times 2$ Arrays and Diverging Components in the Candecomp/Parafac Model
- Tensor Rank and the Ill-Posedness of the Best Low-Rank Approximation Problem
- Blind Multilinear Identification
- Most Tensor Problems Are NP-Hard
- The Rotation of Eigenvectors by a Perturbation. III
- Subtracting a best rank-1 approximation may increase tensor rank