Neural networks with linear threshold activations: structure and algorithms
Publication: Q6589753
Cites work
- scientific article; zbMATH DE number 3385535 (no title available)
- Approximating threshold circuits by rational functions
- Approximation Algorithms for Training One-Node ReLU Neural Networks
- Approximation by superpositions of a sigmoidal function
- Complexity of training ReLU neural network
- Neural Network Learning
- Robust trainability of single neurons
- Size-Depth Tradeoffs for Threshold Circuits
- Super-linear gate and super-quadratic wire lower bounds for depth-two and depth-three threshold circuits
- The Computational Complexity of ReLU Network Training Parameterized by Data Dimensionality