Low-CP-rank tensor completion via practical regularization
From MaRDI portal
Publication:2113663
Abstract: Dimension reduction techniques are often used when a high-dimensional tensor has relatively low intrinsic rank compared to its ambient dimension. CANDECOMP/PARAFAC (CP) tensor completion is a widely used approach to find a low-rank approximation of a given tensor. In this tensor model, a regularized optimization problem is formulated with an appropriate choice of the regularization parameter, and this choice is important for the approximation accuracy. However, the emergence of large-scale data poses an onerous computational burden for computing the regularization parameter via classical approaches such as weighted generalized cross validation (WGCV), the unbiased predictive risk estimator, and the discrepancy principle. To improve the efficiency of choosing the regularization parameter and to leverage the accuracy of the CP tensor, we propose a new algorithm for tensor completion that embeds the flexible hybrid method into the CP tensor framework. The main benefits of this method include automatic and efficient incorporation of regularization, improved reconstruction, and algorithmic robustness. Numerical examples from image reconstruction and model order reduction demonstrate the performance of the proposed algorithm.
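The abstract describes completing a tensor by fitting a regularized CP model. The sketch below illustrates the general idea with plain regularized alternating least squares on a third-order tensor; it is not the paper's flexible hybrid Krylov algorithm. In particular, the fixed Tikhonov parameter `lam` stands in for the automatic, per-iteration parameter selection that the paper's hybrid method provides, and all function and variable names here are illustrative assumptions.

```python
import numpy as np

def cp_completion_als(T, mask, rank, lam=1e-2, iters=50, seed=0):
    """Sketch: CP tensor completion via regularized alternating least squares.

    T    : 3-way data array (only entries where mask is True are trusted)
    mask : boolean array, True where T is observed
    rank : target CP rank
    lam  : Tikhonov regularization parameter (fixed here; the paper
           instead selects it automatically with a flexible hybrid method)
    """
    rng = np.random.default_rng(seed)
    dims = T.shape
    # random initial factor matrices A[0], A[1], A[2]
    A = [rng.standard_normal((d, rank)) for d in dims]
    for _ in range(iters):
        # impute the missing entries from the current CP model
        full = np.einsum('ir,jr,kr->ijk', *A)
        X = np.where(mask, T, full)
        for n in range(3):
            # mode-n unfolding of X: dims[n] x (product of other dims)
            Xn = np.moveaxis(X, n, 0).reshape(dims[n], -1)
            # Khatri-Rao product of the other two factors,
            # with row ordering matching the C-order unfolding above
            others = [A[m] for m in range(3) if m != n]
            KR = np.einsum('jr,kr->jkr',
                           others[0], others[1]).reshape(-1, rank)
            # regularized normal equations:
            # A[n] (KR^T KR + lam I) = Xn KR
            G = KR.T @ KR + lam * np.eye(rank)
            A[n] = np.linalg.solve(G, KR.T @ Xn.T).T
    return np.einsum('ir,jr,kr->ijk', *A)
```

With enough observed entries and an exact low CP rank, the imputation-plus-ALS loop typically recovers the tensor up to a small regularization bias; the paper's contribution is replacing the fixed `lam` with an efficient, automatically tuned choice.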
Recommendations
- Recovering low CP/Tucker ranked tensors, with applications in tensor completion
- Fundamental conditions for low-CP-rank tensor completion
- Tensor completion and low-\(n\)-rank tensor recovery via convex optimization
- Tensor completion using total variation and low-rank matrix factorization
- Tensor Factorization for Low-Rank Tensor Completion
- Low-Tubal-Rank Tensor Completion Using Alternating Minimization
- Low-rank tensor completion by Riemannian optimization
- Low-rank tensor completion using matrix factorization based on tensor train rank and total variation
- Low-rank tensor completion based on log-det rank approximation and matrix factorization
- Low-rank tensor completion via smooth matrix factorization
Cites work
- scientific article; zbMATH DE number 3877692
- scientific article; zbMATH DE number 713342
- scientific article; zbMATH DE number 3298300
- scientific article; zbMATH DE number 3303655
- A Bidiagonalization-Regularization Procedure for Large Scale Discretizations of Ill-Posed Problems
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- A bidiagonalization algorithm for solving large and sparse ill-posed systems of linear equations
- A weighted-GCV method for Lanczos-hybrid regularization
- Algorithms for Numerical Analysis in High Dimensions
- Computational Methods for Inverse Problems
- Convergence rates for greedy algorithms in reduced basis methods
- Discrete inverse problems. Insight and algorithms.
- Estimation of the mean of a multivariate normal distribution
- Flexible Krylov methods for \(\ell_p\) regularization
- Krylov methods for inverse problems: Surveying classical, and introducing new, algorithmic approaches
- Linear Support Tensor Machine With LSK Channels: Pedestrian Detection in Thermal Infrared Images
- Local convergence of the alternating least squares algorithm for canonical tensor approximation
- Low-rank approximation of tensors via sparse optimization.
- Some convergence results on the regularized alternating least-squares method for tensor decomposition
- Sparse Regularization via Convex Analysis
- Spectral Methods for Time-Dependent Problems
- Spectral Methods in MATLAB
- Tensor methods for the Boltzmann-BGK equation
- The Power of Convex Relaxation: Near-Optimal Matrix Completion
- The projected GSURE for automatic parameter tuning in iterative shrinkage methods
- Three-way arrays: rank and uniqueness of trilinear decompositions, with application to arithmetic complexity and statistics
Cited in (8)
- Recovering low CP/Tucker ranked tensors, with applications in tensor completion
- scientific article; zbMATH DE number 6474942
- Iterative hard thresholding for low CP-rank tensor models
- Low-rank tensor completion using matrix factorization based on tensor train rank and total variation
- TR-STF: a fast and accurate tensor ring decomposition algorithm via defined scaled tri-factorization
- Variational Bayesian inference for CP tensor completion with subspace information
- An approximation method of CP rank for third-order tensor completion
- Low-Tubal-Rank Tensor Completion Using Alternating Minimization