PAC-learning gains of Turing machines over circuits and neural networks
Publication: 2111729
DOI: 10.1016/j.physd.2022.133585
OpenAlex ID: W3137745286
MaRDI QID: Q2111729
Authors: Brieuc Pinon, Raphaël M. Jungers, Jean-Charles Delvenne
Publication date: 17 January 2023
Published in: Physica D
Full work available at URL: https://arxiv.org/abs/2103.12686
Keywords: computational complexity; deep learning; Kolmogorov complexity; minimum description length; PAC-learning; program induction
Cites Work
- Understanding Machine Learning
- Computational Complexity
- A formal theory of inductive inference. Part I
- Relations Among Complexity Measures
- A formal theory of inductive inference. Part II
- Boolean Function Complexity: Advances and Frontiers
- Occam's razor
- Circuit-size lower bounds and non-reducibility to sparse sets
- Two-Tape Simulation of Multitape Turing Machines
- The network complexity and the Turing machine complexity of finite functions
- On the Uniform Convergence of Relative Frequencies of Events to Their Probabilities
- Some Results on Average-Case Hardness Within the Polynomial Hierarchy
- Reviewing bounds on the circuit size of the hardest functions
- An introduction to Kolmogorov complexity and its applications
- Decision trees do not generalize to new variations