Smoothed analysis of tensor decompositions
Publication:5259595
Abstract: Low-rank tensor decompositions are a powerful tool for learning generative models, and uniqueness results give them a significant advantage over matrix decomposition methods. However, tensors pose serious algorithmic challenges, and tensor analogs of much of the matrix algebra toolkit are unlikely to exist because of hardness results. Efficient decomposition in the overcomplete case (where rank exceeds dimension) is particularly challenging. We introduce a smoothed analysis model for studying these questions and develop an efficient algorithm for tensor decomposition in the highly overcomplete case (rank polynomial in the dimension). In this setting, we show that our algorithm is robust to inverse polynomial error -- a crucial property for applications in learning, since we are only allowed a polynomial number of samples. While algorithms are known for exact tensor decomposition in some overcomplete settings, our main contribution is in analyzing their stability in the framework of smoothed analysis. Our main technical contribution is to show that tensor products of perturbed vectors are linearly independent in a robust sense (i.e., the associated matrix has singular values that are at least an inverse polynomial). This key result paves the way for applying tensor methods to learning problems in the smoothed setting. In particular, we use it to obtain results for learning multi-view models and mixtures of axis-aligned Gaussians where there are many more "components" than dimensions. The assumption here is that the model is not adversarially chosen, which we formalize by a perturbation of the model parameters. We believe this is an appealing way to analyze realistic instances of learning problems, since this framework allows us to overcome many of the usual limitations of using tensor methods.
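The key technical claim, that tensor products of randomly perturbed vectors are linearly independent in a robust sense, can be checked numerically. The following is a minimal sketch, not the paper's algorithm: it uses order-2 products for simplicity (the paper handles higher-order products), and the dimension, rank, and perturbation size are illustrative assumptions.

```python
# Minimal numerical sketch of the robustness claim: after a small random
# perturbation, the flattened products x_i (x) x_i of the component vectors
# form a matrix whose least singular value is bounded away from zero, even
# in the overcomplete regime (rank > dimension). All parameters below are
# illustrative assumptions, not values from the paper.
import numpy as np

rng = np.random.default_rng(0)
n = 10       # ambient dimension
R = 40       # number of components: overcomplete, since R > n (but R < n^2)
rho = 0.01   # magnitude of the smoothing perturbation

# Adversarial starting point: all component vectors identical (fully collinear).
base = np.ones((n, R))
# Smoothed-analysis step: independently perturb each component vector.
X = base + rho * rng.standard_normal((n, R))

# Columns are the flattened order-2 tensor products x_i (x) x_i, giving an
# n^2 x R matrix (the Khatri-Rao product of X with itself).
M = np.column_stack([np.outer(X[:, i], X[:, i]).ravel() for i in range(R)])

# Robust linear independence: the least singular value is strictly positive.
sigma_min = np.linalg.svd(M, compute_uv=False)[-1]
print(f"least singular value of the n^2 x R product matrix: {sigma_min:.3e}")
```

With rho = 0 every column is the same rank-one matrix and the least singular value is exactly zero; the smoothing perturbation is what rules out this degeneracy, and the paper's result quantifies the effect as an inverse-polynomial lower bound.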
Recommendations
- Smoothed analysis for tensor methods in unsupervised learning
- Overcomplete order-3 tensor decomposition, blind deconvolution, and Gaussian mixture models
- Analyzing tensor power method dynamics in overcomplete regime
- Tensor decompositions for learning latent variable models
- Dictionary learning and tensor decomposition via the sum-of-squares method
Cited in (22)
- Speeding up random walk mixing by starting from a uniform vertex
- Dictionary learning and tensor decomposition via the sum-of-squares method
- On the optimization landscape of tensor decompositions
- Composition-aware spectroscopic tomography
- Effective criteria for specific identifiability of tensors and forms
- An operator theoretic approach to nonparametric mixture models
- Learning attribute patterns in high-dimensional structured latent attribute models
- Overcomplete order-3 tensor decomposition, blind deconvolution, and Gaussian mixture models
- A general framework for tensor screening through smoothing
- Smoothed analysis for tensor methods in unsupervised learning
- Homotopy techniques for tensor decomposition and perfect identifiability
- Non-convex matrix completion and related problems via strong duality
- scientific article; zbMATH DE number 7306859 (no title available)
- Absolute reconstruction for sums of powers of linear forms: degree 3 and beyond
- Polynomial Learning of Distribution Families
- Provable ICA with unknown Gaussian noise, and implications for Gaussian mixtures and autoencoders
- Universality of the least singular value for the sum of random matrices
- Complete decomposition of symmetric tensors in linear time and polylogarithmic precision
- Concentration inequalities for random tensors
- Recovering structured probability matrices
- Planning and learning in partially observable systems via filter stability
- Analyzing tensor power method dynamics in overcomplete regime