PAC-learning gains of Turing machines over circuits and neural networks
From MaRDI portal (Publication: Q2111729)
Recommendations
- Efficient learning algorithms yield circuit lower bounds
- scientific article; zbMATH DE number 503283
- Computational Sample Complexity
- Proof of the theory-to-practice gap in deep learning via sampling complexity bounds for neural network approximation spaces
Cites work
- scientific article; zbMATH DE number 1250549 (no title available)
- scientific article; zbMATH DE number 1929951 (no title available)
- A formal theory of inductive inference. Part I
- A formal theory of inductive inference. Part II
- An introduction to Kolmogorov complexity and its applications
- Boolean function complexity. Advances and frontiers.
- Circuit-size lower bounds and non-reducibility to sparse sets
- Computational Complexity
- Decision trees do not generalize to new variations
- Occam's razor
- On the Uniform Convergence of Relative Frequencies of Events to Their Probabilities
- Relations Among Complexity Measures
- Reviewing bounds on the circuit size of the hardest functions
- Some Results on Average-Case Hardness Within the Polynomial Hierarchy
- The network complexity and the Turing machine complexity of finite functions
- Two-Tape Simulation of Multitape Turing Machines
- Understanding machine learning. From theory to algorithms