SOTT: greedy approximation of a tensor as a sum of tensor trains
From MaRDI portal
Publication: Q5065504
Recommendations
- Streaming Tensor Train Approximation
- The alternating linear scheme for tensor optimization in the tensor train format
- A regularized Newton method for the efficient approximation of tensors represented in the canonical tensor format
- Nonnegative tensor train factorization with DMRG technique
- Greedy low-rank approximation in Tucker format of solutions of tensor linear systems
Cites work
- A Stochastic Collocation Method for Elliptic Partial Differential Equations with Random Input Data
- A constructive algorithm for decomposing a tensor into a finite sum of orthonormal rank-1 terms
- A literature survey of low-rank tensor approximation techniques
- Algorithms for Numerical Analysis in High Dimensions
- Alternating least squares as moving subspace correction
- Breaking the Curse of Dimensionality, Or How to Use SVD in Many Dimensions
- Canonical polyadic decomposition of third-order tensors: reduction to generalized eigenvalue decomposition
- Enhanced Line Search: A Novel Method to Accelerate PARAFAC
- Greedy algorithms and M-term approximation with regard to redundant dictionaries
- Local convergence of the alternating least squares algorithm for canonical tensor approximation
- Low rank Tucker-type tensor approximation to classical potentials
- Multigrid accelerated tensor approximation of function related multidimensional arrays
- On Uniqueness and Computation of the Decomposition of a Tensor into Multilinear Rank-$(1,L_r,L_r)$ Terms
- On accelerating the regularized alternating least-squares algorithm for tensors
- On best rank one approximation of tensors
- On local convergence of alternating schemes for optimization of convex problems in the tensor train format
- Optimization-based algorithms for tensor decompositions: canonical polyadic decomposition, decomposition in rank-\((L_r,L_r,1)\) terms, and a new generalization
- Rank-one approximation to high order tensors
- Resonant damping of flexible structures under random excitation
- Spectral tensor-train decomposition
- Tensor Decompositions and Applications
- Tensor Rank and the Ill-Posedness of the Best Low-Rank Approximation Problem
- Tensor networks for dimensionality reduction and large-scale optimization. I: Low-rank tensor decompositions
- Tensor numerical methods in scientific computing
- Tensor spaces and numerical tensor calculus
- Tensor-train decomposition
- The LATIN multiscale computational method and the proper generalized decomposition
- The number of singular vector tuples and uniqueness of best rank-one approximation of tensors
Cited in (13)
- The alternating linear scheme for tensor optimization in the tensor train format
- Higher-order principal component analysis for the approximation of tensors in tree-based low-rank formats
- Quantized CP approximation and sparse tensor interpolation of function-generated data
- The condition number of many tensor decompositions is invariant under Tucker compression
- Performance of the low-rank TT-SVD for large dense tensors on modern multicore CPUs
- Greedy low-rank approximation in Tucker format of solutions of tensor linear systems
- Streaming Tensor Train Approximation
- Tensor train construction from tensor actions, with application to compression of large high order derivative tensors
- High-order tensor estimation via trains of coupled third-order CP and Tucker decompositions
- Black Box Approximation in the Tensor Train Format Initialized by ANOVA Decomposition
- Nonnegative tensor train factorization with DMRG technique
- Adaptive hierarchical subtensor partitioning for tensor compression
- A new scheme for the tensor representation