Memory Capacity of Neural Networks with Threshold and Rectified Linear Unit Activations
Publication: 5037553
DOI: 10.1137/20M1314884
OpenAlex: W3093643472
MaRDI QID: Q5037553
Publication date: 1 March 2022
Published in: SIAM Journal on Mathematics of Data Science
Full work available at URL: https://doi.org/10.1137/20m1314884
MSC classifications:
- Artificial neural networks and deep learning (68T07)
- Learning and adaptive systems in artificial intelligence (68T05)
- Neural networks for/in biological studies, artificial life and related topics (92B20)
Related Items (7)
- Unnamed Item
- Just least squares: binary compressive sampling with low generative intrinsic dimension
- Designing universal causal deep learning models: The geometric (Hyper)transformer
- Approximation in shift-invariant spaces with deep ReLU neural networks
- Stable recovery of entangled weights: towards robust identification of deep neural networks from minimal samples
- The interpolation phase transition in neural networks: memorization and generalization under lazy training
- Information theory and recovery algorithms for data fusion in Earth observation
Cites Work
- On the capabilities of multilayer perceptrons
- Bounds on the learning capacity of some multi-layer networks
- Gradient descent optimizes over-parameterized deep ReLU networks
- The capacity of feedforward neural networks
- High-Dimensional Probability
- Enumeration of Seven-Argument Threshold Functions
- Unnamed Item
- Unnamed Item