Polynomial‐time universality and limitations of deep learning
From MaRDI portal
Publication:6074573
Cites work
- scientific article; zbMATH DE number 5595162
- scientific article; zbMATH DE number 872005
- scientific article; zbMATH DE number 3314813
- Characterizing statistical query learning: simplified notions and proofs
- Cryptographic hardness for learning intersections of halfspaces
- Deep learning
- Distribution-specific hardness of learning neural networks
- Efficient noise-tolerant learning from statistical queries
- Foundations of machine learning
- Foundations of software technology and theoretical computer science. 16th conference, Hyderabad, India, December 18--20, 1996. Proceedings
- Gradient descent only converges to minimizers: non-isolated critical points and invariant regions
- New lower bounds for statistical query learning
- Noise-tolerant learning, the parity problem, and the statistical query model
- On the complexity of random satisfiability problems with planted solutions
- Statistical algorithms and a lower bound for detecting planted cliques
- Understanding machine learning. From theory to algorithms
- Weakly learning DNF and characterizing statistical query learning using Fourier analysis