Learning polynomial transformations via generalized tensor decompositions
From MaRDI portal
Publication:6499331
Cites work
- scientific article; zbMATH DE number 7626743 (no title available)
- A Decomposition for Three-Way Arrays
- A probabilistic analysis of EM for mixtures of separated, spherical Gaussians
- A spectral algorithm for learning mixture models
- An Introduction to Variational Autoencoders
- Approximating discrete probability distributions with dependence trees
- Black box approximation of tensors in hierarchical Tucker format
- Candidate one-way functions based on expander graphs
- Clustering subgaussian mixtures by semidefinite programming
- Complete Dictionary Recovery Over the Sphere I: Overview and the Geometric Picture
- Cryptographic hardness of random local functions. Survey
- Cryptography in $NC^0$
- Cubic forms in Gaussian variables
- Decomposing overcomplete 3rd order tensors using sum-of-squares algorithms
- Dictionary learning and tensor decomposition via the sum-of-squares method
- Efficient construction of tensor ring representations from sampling
- Efficiently learning Ising models on arbitrary graphs (extended abstract)
- Efficiently learning mixtures of two Gaussians
- Equations for secant varieties of Veronese and other varieties
- Fast spectral algorithms from sum-of-squares proofs: tensor decomposition and planted sparse vectors
- Fast structured matrix computations: tensor rank and Cohn-Umans method
- Hierarchical Singular Value Decomposition of Tensors
- Independent component analysis by general nonlinear Hebbian-like learning rules
- Independent component analysis, a new concept?
- Learning Mixtures of Product Distributions over Discrete Domains
- Learning Theory
- Learning a tree-structured Ising model in order to make predictions
- Learning loosely connected Markov random fields
- Learning mixtures of Gaussians in high dimensions
- Learning mixtures of separated nonspherical Gaussians
- Learning mixtures of spherical Gaussians: moment methods and spectral decompositions (extended abstract)
- List-decodable robust mean estimation and learning mixtures of spherical Gaussians
- Low dimensional manifold model for image processing
- Mean-field approximation, convex hierarchies, and the optimality of correlation rounding: a unified perspective
- Mixture models, robustness, and sum of squares proofs
- Near-Optimal Learning of Tree-Structured Distributions by Chow and Liu
- On the Best Rank-1 and Rank-(R1, R2, …, RN) Approximation of Higher-Order Tensors
- Optimal High-Order Tensor SVD via Tensor-Train Orthogonal Iteration
- Polynomial Learning of Distribution Families
- Provable ICA with unknown Gaussian noise, and implications for Gaussian mixtures and autoencoders
- Provable learning of noisy-OR networks
- Reconstruction of Markov Random Fields from Samples: Some Observations and Algorithms
- Robust moment estimation and improved clustering via sum of squares
- Sample-optimal and efficient learning of tree Ising models
- Smoothed analysis for tensor methods in unsupervised learning
- TT-cross approximation for multidimensional arrays
- Tensor SVD: Statistical and Computational Limits
- Tensor decompositions for learning latent variable models
- Tensor ring decomposition: optimization landscape and one-loop convergence of alternating least squares
- Tensor-train decomposition
- The Average-Case Complexity of Counting Cliques in Erdős–Rényi Hypergraphs
- The minimax learning rates of normal and Ising undirected graphical models
- The sample complexity of learning fixed-structure Bayesian networks
- Tight bounds for learning a mixture of two Gaussians (extended abstract)