Alternating Linear Scheme in a Bayesian Framework for Low-Rank Tensor Approximation
Publication: 5075697
Abstract: Multiway data often occur naturally in a tensorial format that can be approximately represented by a low-rank tensor decomposition. This is useful because it significantly reduces complexity and facilitates the treatment of large-scale data sets. In this paper, we find a low-rank representation of a given tensor by solving a Bayesian inference problem. We divide the overall inference problem into sub-problems in which we sequentially infer the posterior distribution of one tensor decomposition component at a time. This yields a probabilistic interpretation of the well-known iterative algorithm known as the alternating linear scheme (ALS). The interpretation makes it possible to account for measurement noise, to incorporate application-specific prior knowledge, and to quantify the uncertainty of the low-rank tensor estimate. To compute the low-rank tensor estimate from the posterior distributions of the tensor decomposition components, we present an algorithm that performs the unscented transform in tensor train format.
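As a concrete illustration of the scheme described in the abstract, the following is a minimal sketch, not the authors' implementation, of one Bayesian ALS sweep for a tensor-train (TT) decomposition. It assumes i.i.d. Gaussian measurement noise and an independent zero-mean Gaussian prior on each vectorised TT core, so that each core subproblem is a Bayesian linear regression with a Gaussian conditional posterior. All names (`tt_full`, `bayesian_als_sweep`, the variance values) are illustrative, and the design matrix is built naively column by column for transparency rather than speed.

```python
import numpy as np

def tt_full(cores):
    """Contract TT cores (each of shape r_{k-1} x n_k x r_k) into the full tensor."""
    x = cores[0]                                   # shape (1, n_1, r_1)
    for g in cores[1:]:
        x = np.tensordot(x, g, axes=1)             # last axis of x with first of g
    return x.squeeze(axis=(0, -1))                 # boundary ranks r_0 = r_d = 1

def core_design_matrix(cores, k):
    """Matrix A with vec(full tensor) = A @ vec(cores[k]), built column by
    column from unit vectors -- transparent but much slower than the
    structured contractions a real ALS code would use."""
    shape = cores[k].shape
    cols = []
    for j in range(int(np.prod(shape))):
        e = np.zeros(int(np.prod(shape)))
        e[j] = 1.0
        cols.append(tt_full(cores[:k] + [e.reshape(shape)] + cores[k + 1:]).ravel())
    return np.stack(cols, axis=1)

def bayesian_als_sweep(y, cores, noise_var, prior_var):
    """One sweep: with all other cores fixed, each core's subproblem is a
    Bayesian linear regression, so its conditional posterior is Gaussian."""
    post_covs = []
    for k in range(len(cores)):
        A = core_design_matrix(cores, k)
        p = A.shape[1]
        # Gaussian posterior: P = (I/prior_var + A^T A/noise_var)^{-1}
        P = np.linalg.inv(np.eye(p) / prior_var + A.T @ A / noise_var)
        m = P @ (A.T @ y.ravel() / noise_var)      # zero prior mean assumed
        cores[k] = m.reshape(cores[k].shape)       # keep the posterior mean
        post_covs.append(P)
    return cores, post_covs

# Toy usage: recover a small rank-2 TT tensor from noisy measurements.
rng = np.random.default_rng(0)
dims, ranks = [4, 5, 6], [1, 2, 2, 1]
truth = [rng.normal(size=(ranks[i], dims[i], ranks[i + 1])) for i in range(3)]
y = tt_full(truth) + 0.01 * rng.normal(size=dims)
cores = [rng.normal(size=g.shape) for g in truth]
for _ in range(5):
    cores, post_covs = bayesian_als_sweep(y, cores, noise_var=1e-4, prior_var=10.0)
print("relative error:", np.linalg.norm(tt_full(cores) - y) / np.linalg.norm(y))
```

The paper's final step propagates the core posteriors to a distribution over the low-rank tensor estimate via an unscented transform carried out in TT format. The dense sketch below, continuing from the variables above, illustrates the idea under the simplifying assumption of a block-diagonal (independent-core) joint covariance; unlike the paper's algorithm, it forms the full tensor for every sigma point, which is only feasible because this toy example is small.

```python
def unscented_tt_moments(cores, post_covs, kappa=1.0):
    """Push sigma points of the core posteriors through the TT contraction to
    obtain mean and covariance of the full tensor estimate (naive dense form)."""
    shapes = [g.shape for g in cores]
    m = np.concatenate([g.ravel() for g in cores])
    n = m.size
    P = np.zeros((n, n))                   # assumed block-diagonal joint covariance
    i = 0
    for C in post_covs:
        P[i:i + C.shape[0], i:i + C.shape[0]] = C
        i += C.shape[0]
    S = np.linalg.cholesky((n + kappa) * P)          # matrix square root
    sigma = [m] + [m + s for s in S.T] + [m - s for s in S.T]
    w = np.full(2 * n + 1, 1.0 / (2.0 * (n + kappa)))
    w[0] = kappa / (n + kappa)                       # standard UT weights
    def unpack(v):                                   # split the long vector into cores
        out, i = [], 0
        for shp in shapes:
            sz = int(np.prod(shp))
            out.append(v[i:i + sz].reshape(shp))
            i += sz
        return out
    ys = np.stack([tt_full(unpack(v)).ravel() for v in sigma])
    mean = w @ ys
    dev = ys - mean
    cov = (w[:, None] * dev).T @ dev
    return mean.reshape([shp[1] for shp in shapes]), cov

est_mean, est_cov = unscented_tt_moments(cores, post_covs)
print("unscented-mean error:", np.linalg.norm(est_mean - y) / np.linalg.norm(y))
```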
Recommendations
- Efficient alternating least squares algorithms for low multilinear rank approximation of tensors
- Rank Regularization and Bayesian Inference for Tensor Completion and Extrapolation
- Nonconvex Robust Low-Rank Tensor Reconstruction via an Empirical Bayes Method
- Orthogonal low rank tensor approximation: alternating least squares method and its global convergence
- Alternating direction method of multipliers for generalized low-rank tensor recovery
- Optimal low-rank approximations of Bayesian linear inverse problems
- Bayesian tensor regression
- Low-rank nonnegative tensor approximation via alternating projections and sketching
- Bayesian factorizations of big sparse tensors
- Low-rank approximation of tensors
Cites work
- scientific article; zbMATH DE number 5060482
- A constructive arbitrary-degree Kronecker product decomposition of tensors
- A kernel-based framework to tensorial data analysis
- Alternating minimal energy methods for linear systems in higher dimensions
- Analysis of individual differences in multidimensional scaling via an \(n\)-way generalization of ``Eckart-Young'' decomposition
- Approximation of \(2^d\times2^d\) matrices using tensor decomposition
- Bayesian filtering and smoothing
- Equivariant and scale-free Tucker decomposition models
- Nonlinear system modeling and identification using Volterra-PARAFAC models
- On local convergence of alternating schemes for optimization of convex problems in the tensor train format
- Probabilistic Tensor Canonical Polyadic Decomposition With Orthogonal Factors
- Solution of Linear Systems and Matrix Inversion in the TT-Format
- Stable, Robust, and Super Fast Reconstruction of Tensors Using Multi-Way Projections
- Tensor Decomposition for Signal Processing and Machine Learning
- Tensor Decompositions and Applications
- Tensor Networks for Dimensionality Reduction and Large-scale Optimization: Part 2 Applications and Future Perspectives
- Tensor network alternating linear scheme for MIMO Volterra system identification
- Tensor networks for dimensionality reduction and large-scale optimization. I: Low-rank tensor decompositions
- Tensor-train decomposition
- The alternating linear scheme for tensor optimization in the tensor train format
- The density-matrix renormalization group in the age of matrix product states
- Variants of alternating least squares tensor completion in the tensor train format
Cited in: 2 documents