Dictionary learning and tensor decomposition via the sum-of-squares method
From MaRDI portal
Publication:2941498
Abstract: We give a new approach to the dictionary learning (also known as "sparse coding") problem of recovering an unknown \(n \times m\) matrix \(A\) (for \(m \geq n\)) from examples of the form \[ y = Ax + e, \] where \(x\) is a random vector in \(\mathbb{R}^m\) with at most \(\tau m\) nonzero coordinates, and \(e\) is a random noise vector in \(\mathbb{R}^n\) with bounded magnitude. For the case \(m = O(n)\), our algorithm recovers every column of \(A\) within arbitrarily good constant accuracy in time \(m^{O(\log m/\log(\tau^{-1}))}\), in particular achieving polynomial time if \(\tau = m^{-\delta}\) for any \(\delta > 0\), and time \(m^{O(\log m)}\) if \(\tau\) is (a sufficiently small) constant. Prior algorithms with comparable assumptions on the distribution required the vector \(x\) to be much sparser---at most \(\sqrt{n}\) nonzero coordinates---and there were intrinsic barriers preventing these algorithms from applying for denser \(x\). We achieve this by designing an algorithm for noisy tensor decomposition that can recover, under quite general conditions, an approximate rank-one decomposition of a tensor \(T\), given access to a tensor \(T'\) that is \(\tau\)-close to \(T\) in the spectral norm (when considered as a matrix). To our knowledge, this is the first algorithm for tensor decomposition that works in the constant spectral-norm noise regime, where there is no guarantee that the local optima of \(T\) and \(T'\) have similar structures. Our algorithm is based on a novel approach to using and analyzing the Sum of Squares semidefinite programming hierarchy (Parrilo 2000, Lasserre 2001), and it can be viewed as an indication of the utility of this very general and powerful tool for unsupervised learning problems.
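The generative model in the abstract can be made concrete with a minimal sketch: drawing examples \(y = Ax + e\) from a random dictionary with \(\tau m\)-sparse coefficient vectors and bounded noise. The dimensions, sparsity level, noise scale, and the \(\pm 1\) coefficient distribution below are illustrative assumptions, not the paper's exact setup, and the sketch shows only sample generation, not the sum-of-squares recovery algorithm itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical small problem sizes for illustration.
n, m = 20, 40     # dictionary A is n x m, with m >= n
tau = 0.1         # each x has about tau*m nonzero coordinates

# Unknown dictionary; columns scaled to roughly unit norm.
A = rng.standard_normal((n, m)) / np.sqrt(n)

def sample(num_examples):
    """Draw examples y = A x + e with tau*m-sparse x and small bounded noise e."""
    k = max(1, int(tau * m))
    ys = []
    for _ in range(num_examples):
        x = np.zeros(m)
        support = rng.choice(m, size=k, replace=False)
        x[support] = rng.choice([-1.0, 1.0], size=k)  # sparse +/-1 coefficients
        e = 0.01 * rng.standard_normal(n)             # bounded-magnitude noise
        ys.append(A @ x + e)
    return np.array(ys)

Y = sample(100)
print(Y.shape)  # (100, 20): 100 examples, each in R^n
```

The recovery task is then to approximate every column of `A` from `Y` alone; the paper's algorithm does this via noisy tensor decomposition within the sum-of-squares hierarchy.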
Recommendations
- Fast spectral algorithms from sum-of-squares proofs: tensor decomposition and planted sparse vectors
- Noisy tensor completion via the sum-of-squares hierarchy
- Decomposing overcomplete 3rd order tensors using sum-of-squares algorithms
- Spectral algorithms for tensor completion
- Smoothed analysis of tensor decompositions
Cited in (31)
- Optimization over the Boolean hypercube via sums of nonnegative circuit polynomials
- Notes on computational-to-statistical gaps: predictions using statistical physics
- A nearly tight sum-of-squares lower bound for the planted clique problem
- Convergence radius and sample complexity of ITKM algorithms for dictionary learning
- Learning semidefinite regularizers
- Compressed dictionary learning
- Smoothed analysis of tensor decompositions
- Sum-of-squares certificates for maxima of random tensors on the sphere
- Noisy tensor completion via the sum-of-squares hierarchy
- On the optimization landscape of tensor decompositions
- Successive Rank-One Approximations for Nearly Orthogonally Decomposable Symmetric Tensors
- Cardinality minimization, constraints, and regularization: a survey
- scientific article; zbMATH DE number 7307468
- Smoothed analysis for tensor methods in unsupervised learning
- Recursive Least Squares Dictionary Learning Algorithm
- Sum of squares lower bounds from symmetry and a good story
- Dictionary Learning on Grassmann Manifolds
- Tensor Dictionary Learning for Positive Definite Matrices
- Sum-of-squares bounds via Boolean function analysis
- Unique sharp local minimum in \(\ell_1\)-minimization complete dictionary learning
- scientific article; zbMATH DE number 7378399
- Sum of Squares Bounds for the Empty Integral Hull Problem
- Fast spectral algorithms from sum-of-squares proofs: tensor decomposition and planted sparse vectors
- \((L_r,L_r,1)\)-decompositions, sparse component analysis, and the blind separation of sums of exponentials
- Fast overcomplete dictionary construction with probabilistic guarantees
- Average-case complexity of tensor decomposition for low-degree polynomials
- Learning polynomial transformations via generalized tensor decompositions
- Minimax Lower Bounds on Dictionary Learning for Tensor Data
- scientific article; zbMATH DE number 7561507
- Applied harmonic analysis and data processing. Abstracts from the workshop held March 25--31, 2018
- Sums of squares and sparse semidefinite programming
This page was built for publication: Dictionary learning and tensor decomposition via the sum-of-squares method