On polynomial time methods for exact low-rank tensor completion
DOI: 10.1007/s10208-018-09408-6
zbMath: 1436.15031
arXiv: 1702.06980
OpenAlex: W2593198395
MaRDI QID: Q2007852
Publication date: 22 November 2019
Published in: Foundations of Computational Mathematics
Full work available at URL: https://arxiv.org/abs/1702.06980
Keywords: convex optimization; matrix completion; Grassmannian; concentration inequality; spectral algorithm; tensor completion; polynomial time complexity; gradient descent algorithm
Mathematics Subject Classification: Analysis of algorithms and problem complexity (68Q25); Convex programming (90C25); Approximation methods and heuristics in mathematical programming (90C59); Matrix completion problems (15A83)
Related Items
- An optimal statistical and computational framework for generalized tensor estimation
- Inference for low-rank tensors -- no need to debias
- Statistically optimal and computationally efficient low rank tensor completion from noisy entries
- Tensor completion by multi-rank via unitary transformation
- Generalized Low-Rank Plus Sparse Tensor Estimation by Fast Riemannian Optimization
- Covariate-Assisted Sparse Tensor Completion
- Latent Space Model for Higher-Order Networks and Generalized Tensor Decomposition
- Factor Models for High-Dimensional Tensor Time Series
- Statistical inference for structured high-dimensional models. Abstracts from the workshop held March 11--17, 2018
- Deterministic Tensor Completion with Hypergraph Expanders
- The Sup-norm Perturbation of HOSVD and Low Rank Tensor Denoising
- Subspace estimation from unbalanced and incomplete data matrices: \({\ell_{2,\infty}}\) statistical guarantees
- Community detection on mixture multilayer networks via regularized tensor decomposition
- Riemannian conjugate gradient descent method for fixed multi rank third-order tensor completion
- ISLET: Fast and Optimal Low-Rank Tensor Regression via Importance Sketching
Cites Work
- On tensor completion via nuclear norm minimization
- Low-rank tensor completion by Riemannian optimization
- User-friendly tail bounds for sums of random matrices
- Linear and nonlinear programming.
- A Bennett concentration inequality and its application to suprema of empirical processes
- Decoupling inequalities for the tail probabilities of multivariate \(U\)-statistics
- Tensor theta norms and low rank recovery
- Noisy tensor completion via the sum-of-squares hierarchy
- Low rank tensor recovery via iterative hard thresholding
- Exact matrix completion via convex optimization
- Tensor decompositions for learning latent variable models
- Tensor completion and low-n-rank tensor recovery via convex optimization
- A Newton–Grassmann Method for Computing the Best Multilinear Rank-$(r_1,$ $r_2,$ $r_3)$ Approximation of a Tensor
- The Geometry of Algorithms with Orthogonality Constraints
- Incoherent Tensor Norms and Their Applications in Higher Order Tensor Completion
- Tensor Algebra and Multidimensional Harmonic Retrieval in Signal Processing for MIMO Radar
- Tensor-Based Formulation and Nuclear Norm Regularization for Multienergy Computed Tomography
- Spectral Algorithms for Tensor Completion
- Quasi-Newton Methods on Grassmannians and Multilinear Approximations of Tensors
- A useful variant of the Davis–Kahan theorem for statisticians
- Recovering Low-Rank Matrices From Few Coefficients in Any Basis
- Matrix Completion From a Few Entries
- The Power of Convex Relaxation: Near-Optimal Matrix Completion
- Tensor Rank and the Ill-Posedness of the Best Low-Rank Approximation Problem
- Most Tensor Problems Are NP-Hard
- A Simpler Approach to Matrix Completion