On the complexity of loading shallow neural networks
From MaRDI portal
Recommendations
- scientific article; zbMATH DE number 774006
- On the infeasibility of training neural networks with small mean-squared error
- Loading Deep Networks Is Hard: The Pyramidal Case
- The computational intractability of training sigmoidal neural networks
- Complexity of shallow networks representing finite mappings
Cites work
Cited in (15)
- Stable recovery of entangled weights: towards robust identification of deep neural networks from minimal samples
- On the complexity of optimization problems for 3-dimensional convex polyhedra and decision trees
- On learning a union of half spaces
- On minimal representations of shallow ReLU networks
- Loading Deep Networks Is Hard: The Pyramidal Case
- On the complexity of approximating and illuminating three-dimensional convex polyhedra
- Training a Single Sigmoidal Neuron Is Hard
- Information theory and recovery algorithms for data fusion in Earth observation
- scientific article; zbMATH DE number 774006
- Learning from hints in neural networks
- A review of combinatorial problems arising in feedforward neural network design
- Complexity of network training for classes of neural networks
- Wrappers for feature subset selection
- scientific article; zbMATH DE number 910890
- Neural networks and complexity theory
MaRDI item: Q1105389