A Riemannian gradient ascent algorithm with applications to orthogonal approximation problems of symmetric tensors
DOI: 10.1016/j.apnum.2022.08.005 · zbMath: 1503.65130 · OpenAlex: W4293014333 · Wikidata: Q113880028 · Scholia: Q113880028 · MaRDI QID: Q2085661
Weiwei Yang, Zhou Sheng, Jie Wen
Publication date: 18 October 2022
Published in: Applied Numerical Mathematics
Full work available at URL: https://doi.org/10.1016/j.apnum.2022.08.005
Keywords: global convergence; symmetric tensor; tensor approximation; Riemannian gradient; Łojasiewicz gradient inequality
MSC classification: Numerical mathematical programming methods (65K05); Multilinear algebra, tensor calculus (15A69); Numerical linear algebra (65F99)
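The title and keywords describe Riemannian gradient ascent applied to orthogonal approximation of symmetric tensors. As a minimal illustrative sketch only (not the paper's algorithm, whose constraints and step-size rules differ), the following shows the basic Riemannian gradient ascent template on the unit sphere for the best symmetric rank-1 approximation problem, i.e. maximizing \(f(x) = \mathcal{T}(x, x, x)\) over \(\|x\| = 1\); all function names and parameters here are hypothetical.

```python
import numpy as np

def symmetrize3(A):
    # Average over all 6 index permutations to obtain a symmetric 3rd-order tensor.
    perms = [(0, 1, 2), (0, 2, 1), (1, 0, 2), (1, 2, 0), (2, 0, 1), (2, 1, 0)]
    return sum(np.transpose(A, p) for p in perms) / 6.0

def riemannian_gradient_ascent(T, x0, step=0.1, iters=1000):
    """Illustrative sketch: maximize f(x) = T(x, x, x) over the unit sphere.

    Riemannian gradient = projection of the Euclidean gradient onto the
    tangent space of the sphere at x; retraction = renormalization.
    """
    x = x0 / np.linalg.norm(x0)
    for _ in range(iters):
        Txx = np.einsum('ijk,j,k->i', T, x, x)   # contraction T(x, x, .)
        egrad = 3.0 * Txx                        # Euclidean gradient of f
        rgrad = egrad - (egrad @ x) * x          # project onto tangent space
        x = x + step * rgrad                     # ascent step in ambient space
        x = x / np.linalg.norm(x)                # retract back onto the sphere
    lam = np.einsum('ijk,i,j,k->', T, x, x, x)   # f at the computed point
    return lam, x

rng = np.random.default_rng(0)
T = symmetrize3(rng.standard_normal((5, 5, 5)))
lam, x = riemannian_gradient_ascent(T, rng.standard_normal(5))
```

The pair \((\lambda, x)\) returned here is a candidate Z-eigenpair of \(\mathcal{T}\); fixed-step ascent is used purely for brevity, whereas practical methods use line searches or trust regions (cf. the Manopt toolbox cited below).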
Uses Software
Cites Work
- A feasible method for optimization with orthogonality constraints
- Tensor Decompositions and Applications
- A Dai-Yuan-type Riemannian conjugate gradient method with the weak Wolfe conditions
- Canonical polyadic decomposition of third-order tensors: relaxed uniqueness conditions and algebraic algorithm
- A splitting method for orthogonality constrained problems
- Differential-geometric Newton method for the best rank-\((R_1, R_2, R_3)\) approximation of tensors
- Independent component analysis, a new concept?
- Tensor eigenvalues and their applications
- Monotonically convergent algorithms for symmetric tensor approximation
- Krylov-type methods for tensor computations. I
- An inexact augmented Lagrangian method for computing strongly orthogonal decompositions of tensors
- Orthogonal Tensor Decompositions
- Rank-One Approximation to High Order Tensors
- On the Best Rank-1 Approximation of Higher-Order Supersymmetric Tensors
- Jacobi Algorithm for the Best Low Multilinear Rank Approximation of Symmetric Tensors
- Projection-like Retractions on Matrix Manifolds
- Manopt, a Matlab toolbox for optimization on manifolds
- Semidefinite Relaxations for Best Rank-1 Tensor Approximations
- Convergence Results for Projected Line-Search Methods on Varieties of Low-Rank Matrices Via Łojasiewicz Inequality
- Best Low Multilinear Rank Approximation of Higher-Order Tensors, Based on the Riemannian Trust-Region Scheme
- Globally Convergent Jacobi-Type Algorithms for Simultaneous Orthogonal Symmetric Tensor Diagonalization
- A Riemannian BFGS Method for Nonconvex Optimization Problems
- Shifted Power Method for Computing Tensor Eigenpairs
- On the Tensor SVD and the Optimal Low Rank Orthogonal Approximation of Tensors
- Two-Point Step Size Gradient Methods
- Numerical Optimization
- A Multilinear Singular Value Decomposition
- On the Best Rank-1 and Rank-\((R_1, R_2, \ldots, R_N)\) Approximation of Higher-Order Tensors
- Symmetric orthogonal approximation to symmetric tensors with applications to image reconstruction
- A New First-Order Algorithmic Framework for Optimization Problems with Orthogonality Constraints
- Contrasts, independent component analysis, and blind deconvolution
- The Best Rank-1 Approximation of a Symmetric Tensor and Related Spherical Optimization Problems
- Approximate Matrix and Tensor Diagonalization by Unitary Transformations: Convergence of Jacobi-Type Algorithms
- Quasi-Newton Methods on Grassmannians and Multilinear Approximations of Tensors
- A new, globally convergent Riemannian conjugate gradient method
- A Jacobi-Type Method for Computing Orthogonal Tensor Decompositions
- Convergence of the Iterates of Descent Methods for Analytic Cost Functions