Variants of Alternating Least Squares Tensor Completion in the Tensor Train Format
Publication: 3447473
DOI: 10.1137/130942401
zbMath: 1327.15054
arXiv: 1509.00311
OpenAlex: W2160791191
MaRDI QID: Q3447473
Lars Grasedyck, Melanie Kluge, Sebastian Krämer
Publication date: 27 October 2015
Published in: SIAM Journal on Scientific Computing
Full work available at URL: https://arxiv.org/abs/1509.00311
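
The record carries no abstract, so for orientation only: alternating least squares (ALS) completion keeps the unknown tensor in tensor train (TT) form and re-fits one TT core at a time by least squares over the observed entries, exploiting that the tensor is linear in each core when the other cores are held fixed. Below is a minimal Python/NumPy sketch of that basic idea for a 3-way tensor; every function name, shape, and the demo setup are illustrative assumptions, not a reproduction of the specific variants studied in the paper.

import numpy as np

def tt_entry(cores, idx):
    # Evaluate one entry of the tensor from its TT cores;
    # cores[k] has shape (r_{k-1}, n_k, r_k) with r_0 = r_d = 1.
    v = np.ones((1,))
    for G, i in zip(cores, idx):
        v = v @ G[:, i, :]
    return v.item()

def als_update_core(cores, k, samples, values):
    # Re-fit core k by least squares on the observed entries. With the
    # other cores fixed, T[idx] = L(idx) @ cores[k][:, idx[k], :] @ R(idx)
    # is linear in core k, so each slice solves a small LS problem.
    rl, n, rr = cores[k].shape
    rows = [[] for _ in range(n)]
    rhs = [[] for _ in range(n)]
    for idx, y in zip(samples, values):
        L = np.ones((1,))                    # left interface vector
        for j in range(k):
            L = L @ cores[j][:, idx[j], :]
        R = np.ones((1,))                    # right interface vector
        for j in range(len(cores) - 1, k, -1):
            R = cores[j][:, idx[j], :] @ R
        rows[idx[k]].append(np.kron(L, R))   # design row for vec(slice)
        rhs[idx[k]].append(y)
    for i in range(n):
        if rows[i]:
            sol, *_ = np.linalg.lstsq(np.vstack(rows[i]),
                                      np.asarray(rhs[i]), rcond=None)
            cores[k][:, i, :] = sol.reshape(rl, rr)

# Tiny demo: fit a TT-rank-(2, 2) model to 60% of the entries of a
# tensor that exactly admits that rank.
rng = np.random.default_rng(0)
shape, ranks = (6, 7, 8), (1, 2, 2, 1)
truth = [rng.standard_normal((ranks[k], shape[k], ranks[k + 1]))
         for k in range(3)]
samples = [(i, j, m) for i in range(6) for j in range(7) for m in range(8)
           if rng.random() < 0.6]
values = [tt_entry(truth, s) for s in samples]

cores = [rng.standard_normal((ranks[k], shape[k], ranks[k + 1]))
         for k in range(3)]
for sweep in range(10):                      # forward sweeps only, for brevity
    for k in range(3):
        als_update_core(cores, k, samples, values)
resid = max(abs(tt_entry(cores, s) - y) for s, y in zip(samples, values))
print(f"max residual on observed entries after 10 sweeps: {resid:.2e}")
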
Related Items
- Solving phase-field models in the tensor train format to generate microstructures of bicontinuous composites
- Alternating Linear Scheme in a Bayesian Framework for Low-Rank Tensor Approximation
- Chebfun in Three Dimensions
- Variational Bayesian inference for CP tensor completion with subspace information
- An Adaptive Stochastic Galerkin Tensor Train Discretization for Randomly Perturbed Domains
- Tensor Completion via Gaussian Process-Based Initialization
- Block tensor train decomposition for missing data estimation
- Low-Rank Tensor Recovery using Sequentially Optimal Modal Projections in Iterative Hard Thresholding (SeMPIHT)
- Tensor Algorithms for Advanced Sensitivity Metrics
- Tensor train rank minimization with nonlocal self-similarity for tensor completion
- Gradient-based optimization for regression in the functional tensor-train format
- Non-intrusive tensor reconstruction for high-dimensional random PDEs
- Stable ALS approximation in the TT-format for rank-adaptive tensor completion
- New Riemannian Preconditioned Algorithms for Tensor Completion via Polyadic Decomposition
- Tensor Bi-CR Methods for Solutions of High Order Tensor Equation Accompanied by Einstein Product
Cites Work
- Tensor Decompositions and Applications
- Tensor-Train Decomposition
- TT-cross approximation for multidimensional arrays
- Low-rank tensor completion by Riemannian optimization
- Black box approximation of tensors in hierarchical Tucker format
- Solving a low-rank factorization model for matrix completion by a nonlinear successive over-relaxation algorithm
- Learning with tensors: a framework based on convex optimization and spectral regularization
- A new scheme for the tensor representation
- Optimization on the hierarchical Tucker manifold - applications to tensor completion
- Low rank tensor recovery via iterative hard thresholding
- On Local Convergence of Alternating Schemes for Optimization of Convex Problems in the Tensor Train Format
- The Alternating Linear Scheme for Tensor Optimization in the Tensor Train Format
- Hierarchical Singular Value Decomposition of Tensors
- Tensor completion and low-n-rank tensor recovery via convex optimization
- Sparse tensor discretizations of high-dimensional parametric and stochastic PDEs
- Tensor Spaces and Numerical Tensor Calculus
- Breaking the Curse of Dimensionality, Or How to Use SVD in Many Dimensions