Polynomial‐time universality and limitations of deep learning
Publication: 6074573
DOI: 10.1002/CPA.22121
MaRDI QID: Q6074573
Authors: Emmanuel Abbe, Colin Sandon
Publication date: 12 October 2023
Published in: Communications on Pure and Applied Mathematics
Cites Work
- Deep learning
- Understanding machine learning. From theory to algorithms
- Title not available
- Title not available
- Foundations of machine learning
- Efficient noise-tolerant learning from statistical queries
- Noise-tolerant learning, the parity problem, and the statistical query model
- Title not available
- Weakly learning DNF and characterizing statistical query learning using Fourier analysis
- New lower bounds for statistical query learning
- Characterizing statistical query learning: simplified notions and proofs
- Cryptographic hardness for learning intersections of halfspaces
- Foundations of software technology and theoretical computer science. 16th conference, Hyderabad, India, December 18--20, 1996. Proceedings
- Gradient descent only converges to minimizers: non-isolated critical points and invariant regions
- Distribution-specific hardness of learning neural networks
- Statistical algorithms and a lower bound for detecting planted cliques
- On the complexity of random satisfiability problems with planted solutions
- Statistical Query Algorithms for Mean Vector Estimation and Stochastic Convex Optimization
Cited In (1)