Learning polynomial transformations via generalized tensor decompositions
DOI: 10.1145/3564246.3585209
MaRDI QID: Q6499331
Authors: Jerry Li, Anru R. Zhang, Sitan Chen, Yuanzhi Li
Publication date: 8 May 2024
Keywords: unsupervised learning, sum-of-squares, generative models, matrix product states, tensor ring decomposition, pushforwards
Cites Work
- Independent component analysis, a new concept?
- Tensor SVD: Statistical and Computational Limits
- Approximating discrete probability distributions with dependence trees
- Tensor-Train Decomposition
- TT-cross approximation for multidimensional arrays
- Hierarchical Singular Value Decomposition of Tensors
- On the Best Rank-1 and Rank-(R1, R2, ..., RN) Approximation of Higher-Order Tensors
- Learning Mixtures of Product Distributions over Discrete Domains
- Independent component analysis by general nonlinear Hebbian-like learning rules
- Tensor decompositions for learning latent variable models
- Learning mixtures of separated nonspherical Gaussians
- Efficiently learning mixtures of two Gaussians
- Learning Theory
- Candidate One-Way Functions Based on Expander Graphs
- Cryptography in NC^0
- Equations for secant varieties of Veronese and other varieties
- Cryptographic hardness of random local functions. Survey
- A Decomposition for Three-Way Arrays
- Reconstruction of Markov Random Fields from Samples: Some Observations and Algorithms
- Title not available
- A spectral algorithm for learning mixture models
- Black box approximation of tensors in hierarchical Tucker format
- Complete Dictionary Recovery Over the Sphere I: Overview and the Geometric Picture
- Fast structured matrix computations: tensor rank and Cohn-Umans method
- Fast spectral algorithms from sum-of-squares proofs: tensor decomposition and planted sparse vectors
- Learning Mixtures of Gaussians in High Dimensions
- The sample complexity of learning fixed-structure Bayesian networks
- Cubic forms in Gaussian variables
- Provable ICA with unknown Gaussian noise, and implications for Gaussian mixtures and autoencoders
- Provable learning of noisy-OR networks
- Learning mixtures of spherical Gaussians
- Low Dimensional Manifold Model for Image Processing
- Efficiently Learning Ising Models on Arbitrary Graphs
- Dictionary Learning and Tensor Decomposition via the Sum-of-Squares Method
- Clustering subgaussian mixtures by semidefinite programming
- Decomposing Overcomplete 3rd Order Tensors using Sum-of-Squares Algorithms
- Polynomial Learning of Distribution Families
- Title not available
- An Introduction to Variational Autoencoders
- The minimax learning rates of normal and Ising undirected graphical models
- Robust moment estimation and improved clustering via sum of squares
- Tight Bounds for Learning a Mixture of Two Gaussians
- List-decodable robust mean estimation and learning mixtures of spherical Gaussians
- Title not available
- Optimal High-Order Tensor SVD via Tensor-Train Orthogonal Iteration
- Smoothed analysis for tensor methods in unsupervised learning
- Mean-field approximation, convex hierarchies, and the optimality of correlation rounding: a unified perspective
- Efficient Construction of Tensor Ring Representations from Sampling
- Learning a tree-structured Ising model in order to make predictions
- Mixture models, robustness, and sum of squares proofs
- The Average-Case Complexity of Counting Cliques in Erdős–Rényi Hypergraphs
- Tensor Ring Decomposition: Optimization Landscape and One-loop Convergence of Alternating Least Squares
- Sample-optimal and efficient learning of tree Ising models
- Near-Optimal Learning of Tree-Structured Distributions by Chow and Liu
Cited In (1)