Dictionary learning and tensor decomposition via the sum-of-squares method
Publication:2941498
DOI: 10.1145/2746539.2746605 · zbMATH Open: 1321.68396 · arXiv: 1407.1543 · OpenAlex: W2053885701 · MaRDI QID: Q2941498 · FDO: Q2941498
Authors: Boaz Barak, Jonathan Kelner, David Steurer
Publication date: 21 August 2015
Published in: Proceedings of the forty-seventh annual ACM symposium on Theory of Computing
Abstract: We give a new approach to the dictionary learning (also known as "sparse coding") problem of recovering an unknown \(n\times m\) matrix \(A\) (for \(m \geq n\)) from examples of the form \[ y = Ax + e, \] where \(x\) is a random vector in \(\mathbb{R}^m\) with at most \(\tau m\) nonzero coordinates, and \(e\) is a random noise vector in \(\mathbb{R}^n\) with bounded magnitude. For the case \(m = O(n)\), our algorithm recovers every column of \(A\) within arbitrarily good constant accuracy in time \(m^{O(\log m/\log(\tau^{-1}))}\), in particular achieving polynomial time if \(\tau = m^{-\delta}\) for any \(\delta > 0\), and time \(m^{O(\log m)}\) if \(\tau\) is (a sufficiently small) constant. Prior algorithms with comparable assumptions on the distribution required the vector \(x\) to be much sparser---at most \(\sqrt{n}\) nonzero coordinates---and there were intrinsic barriers preventing these algorithms from applying for denser \(x\). We achieve this by designing an algorithm for noisy tensor decomposition that can recover, under quite general conditions, an approximate rank-one decomposition of a tensor \(T\), given access to a tensor \(T'\) that is \(\tau\)-close to \(T\) in the spectral norm (when considered as a matrix). To our knowledge, this is the first algorithm for tensor decomposition that works in the constant spectral-norm noise regime, where there is no guarantee that the local optima of \(T\) and \(T'\) have similar structures. Our algorithm is based on a novel approach to using and analyzing the Sum of Squares semidefinite programming hierarchy (Parrilo 2000, Lasserre 2001), and it can be viewed as an indication of the utility of this very general and powerful tool for unsupervised learning problems.
Full work available at URL: https://arxiv.org/abs/1407.1543
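The generative model in the abstract can be illustrated with a minimal sketch. This is not the paper's algorithm, only the observation model it assumes: a sample \(y = Ax + e\) with a sparse coefficient vector \(x\) and bounded noise \(e\). All concrete values below (the dimensions \(n, m\), the sparsity \(\tau\), the noise bound, and the \(\pm 1\) coefficient distribution) are hypothetical choices made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

n, m = 50, 100          # dictionary dimensions (m >= n); values are illustrative
tau = 0.1               # sparsity parameter: x has at most tau*m nonzero coordinates
noise_bound = 0.01      # bound on the magnitude of the noise vector e

# Unknown dictionary A with unit-norm columns (normalization is an
# assumption for this sketch, not a condition quoted from the abstract).
A = rng.standard_normal((n, m))
A /= np.linalg.norm(A, axis=0)

def sample_example(rng):
    """Draw one observation y = A x + e together with the hidden x."""
    x = np.zeros(m)
    support = rng.choice(m, size=int(tau * m), replace=False)
    x[support] = rng.choice([-1.0, 1.0], size=len(support))  # sparse +-1 coefficients
    e = rng.uniform(-noise_bound, noise_bound, size=n)       # bounded-magnitude noise
    return A @ x + e, x

y, x = sample_example(rng)
print(y.shape)  # (50,)
```

The dictionary learning task is then to recover the columns of `A` (up to sign and permutation) from many such samples `y` alone, without seeing `x` or `e`.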
Recommendations
- Fast spectral algorithms from sum-of-squares proofs: tensor decomposition and planted sparse vectors
- Noisy tensor completion via the sum-of-squares hierarchy
- Decomposing overcomplete 3rd order tensors using sum-of-squares algorithms
- Spectral algorithms for tensor completion
- Smoothed analysis of tensor decompositions
Cited In (31)
- Notes on computational-to-statistical gaps: predictions using statistical physics
- Convergence radius and sample complexity of ITKM algorithms for dictionary learning
- Learning semidefinite regularizers
- Compressed dictionary learning
- Smoothed analysis of tensor decompositions
- Sum-of-squares certificates for maxima of random tensors on the sphere
- Successive Rank-One Approximations for Nearly Orthogonally Decomposable Symmetric Tensors
- Noisy tensor completion via the sum-of-squares hierarchy
- On the optimization landscape of tensor decompositions
- Cardinality minimization, constraints, and regularization: a survey
- Smoothed analysis for tensor methods in unsupervised learning
- Recursive Least Squares Dictionary Learning Algorithm
- Sum of squares lower bounds from symmetry and a good story
- Dictionary Learning on Grassmann Manifolds
- Tensor Dictionary Learning for Positive Definite Matrices
- Sum-of-squares bounds via Boolean function analysis
- A Nearly Tight Sum-of-Squares Lower Bound for the Planted Clique Problem
- Unique sharp local minimum in \(\ell_1\)-minimization complete dictionary learning
- Sum of Squares Bounds for the Empty Integral Hull Problem
- Fast spectral algorithms from sum-of-squares proofs: tensor decomposition and planted sparse vectors
- \((L_r,L_r,1)\)-decompositions, sparse component analysis, and the blind separation of sums of exponentials
- Average-case complexity of tensor decomposition for low-degree polynomials
- Learning polynomial transformations via generalized tensor decompositions
- Fast overcomplete dictionary construction with probabilistic guarantees
- Minimax Lower Bounds on Dictionary Learning for Tensor Data
- Sums of squares and sparse semidefinite programming
- Applied harmonic analysis and data processing. Abstracts from the workshop held March 25--31, 2018
- Optimization over the Boolean hypercube via sums of nonnegative circuit polynomials
This page was built for publication: Dictionary learning and tensor decomposition via the sum-of-squares method