Fourier PCA and robust tensor decomposition
From MaRDI portal
Publication:5259594
Abstract: Fourier PCA is Principal Component Analysis of a matrix obtained from higher order derivatives of the logarithm of the Fourier transform of a distribution. We make this method algorithmic by developing a tensor decomposition method for a pair of tensors sharing the same vectors in rank-one decompositions. Our main application is the first provably polynomial-time algorithm for underdetermined ICA, i.e., learning an n × m matrix A from observations y = Ax where x is drawn from an unknown product distribution with arbitrary non-Gaussian components. The number of component distributions m can be arbitrarily higher than the dimension n, and the columns of A only need to satisfy a natural and efficiently verifiable nondegeneracy condition. As a second application, we give an alternative algorithm for learning mixtures of spherical Gaussians with linearly independent means. These results also hold in the presence of Gaussian noise.
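The core idea — eigendecomposing the Hessian of the log characteristic function to recover the mixing matrix — can be illustrated with a minimal NumPy sketch. This covers only the fully determined case (square, orthogonal A after whitening), not the paper's underdetermined algorithm, and the sample size, evaluation point, and source distribution are illustrative choices, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(0)
n, N = 3, 200_000

# Ground-truth orthogonal mixing matrix (data assumed whitened, i.e. the
# square, fully determined case -- not the underdetermined setting).
A, _ = np.linalg.qr(rng.normal(size=(n, n)))

# Independent non-Gaussian sources: uniform on [-sqrt(3), sqrt(3)] has
# unit variance.  Observations y = A x.
x = rng.uniform(-np.sqrt(3), np.sqrt(3), size=(n, N))
y = A @ x

# Derivatives of the empirical characteristic function
# phi(u) = E[exp(i u.y)], taken analytically rather than by finite
# differences so that sampling noise is not amplified.
u = 0.5 * rng.normal(size=n)
w = np.exp(1j * (u @ y))                 # exp(i u.y), shape (N,)
phi = w.mean()
grad = (1j * y * w).mean(axis=1)         # gradient of phi
hess = -(y * w) @ y.T / N                # Hessian of phi

# Hessian of psi(u) = log phi(u).  For y = A x with A orthogonal this
# equals A diag(psi_j''(a_j . u)) A^T, so its eigenvectors are the columns
# of A (up to sign and permutation) whenever the eigenvalues are distinct
# -- which is exactly where non-Gaussianity of the sources is needed.
H = hess / phi - np.outer(grad, grad) / phi**2

_, V = np.linalg.eigh(H.real)

# Each recovered column should align (up to sign) with a true column.
print(np.abs(V.T @ A).max(axis=0))
```

For Gaussian sources every psi_j'' is the same constant, the Hessian has a fully degenerate spectrum, and the eigenvectors carry no information about A; the paper's tensor-decomposition machinery generalizes this second-derivative picture to higher derivatives so that m > n columns can be recovered.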
Recommendations
- Provable ICA with unknown Gaussian noise, and implications for Gaussian mixtures and autoencoders
- Overcomplete order-3 tensor decomposition, blind deconvolution, and Gaussian mixture models
- Spectral independent component analysis
- Independent Component Analysis and Blind Signal Separation
- scientific article; zbMATH DE number 1843075
Cites work
- scientific article; zbMATH DE number 5485440
- scientific article; zbMATH DE number 5485574
- Advances in Cryptology – CRYPTO 2004
- Answering \(n^{2+o(1)}\) counting queries with differential privacy is hard
- Bounds on the sample complexity for private learning and private data release
- Characterizing the sample complexity of private learners
- Collusion-secure fingerprinting for digital data
- Differential privacy and the fat-shattering dimension of linear queries
- Efficient algorithms for privately releasing marginals via convex relaxations
- Faster algorithms for privately releasing marginals
- Faster private release of marginals on small databases
- Interactive privacy via the median mechanism
- Iterative Constructions and Private Data Release
- Lower bounds in differential privacy
- New Efficient Attacks on Statistical Disclosure Control Mechanisms
- On the complexity of differentially private data release, efficient algorithms and hardness results
- On the geometry of differential privacy
- Our Data, Ourselves: Privacy Via Distributed Noise Generation
- Private Learning and Sanitization: Pure vs. Approximate Differential Privacy
- The price of privately releasing contingency tables and the spectra of random matrices with correlated rows
- Theory of Cryptography
Cited in (15)
- Eigenvectors of Orthogonally Decomposable Functions
- On the optimization landscape of tensor decompositions
- The sample complexity of multireference alignment
- Overcomplete order-3 tensor decomposition, blind deconvolution, and Gaussian mixture models
- Estimation under group actions: recovering orbits from invariants
- Smoothed analysis for tensor methods in unsupervised learning
- Robustness of the Inner–Outer Factorization and of the Spectral Factorization for FIR Data
- Multi-reference alignment in high dimensions: sample complexity and phase transition
- Robust dimension reduction, fusion frames, and Grassmannian packings
- Estimating Higher-Order Moments Using Symmetric Tensor Decomposition
- Provable ICA with unknown Gaussian noise, and implications for Gaussian mixtures and autoencoders
- Complete decomposition of symmetric tensors in linear time and polylogarithmic precision
- Kronecker-decomposable robust probabilistic tensor discriminant analysis
- Parallel active subspace decomposition for tensor robust principal component analysis
- Average-case complexity of tensor decomposition for low-degree polynomials
This page was built for publication: Fourier PCA and robust tensor decomposition