Memory Capacity of Neural Networks with Threshold and Rectified Linear Unit Activations
DOI: 10.1137/20M1314884
OpenAlex: W3093643472
MaRDI QID: Q5037553
FDO: Q5037553
Authors: Roman Vershynin
Publication date: 1 March 2022
Published in: SIAM Journal on Mathematics of Data Science
Full work available at URL: https://doi.org/10.1137/20m1314884
Recommendations
- On the capacity of associative memories with linear threshold functions
- The capacity of feedforward neural networks
- Title not available (zbMATH DE number 2202162)
- Long term memory storage capacity of multiconnected neural networks
- Neural networks with memory
- Storage capacity of a neural network with state-dependent synapses
- On neural network kernels and the storage capacity problem
MSC classification
- Learning and adaptive systems in artificial intelligence (68T05)
- Artificial neural networks and deep learning (68T07)
- Neural networks for/in biological studies, artificial life and related topics (92B20)
Cites Work
- High-dimensional probability. An introduction with applications in data science
- Enumeration of Seven-Argument Threshold Functions
- Title not available
- Title not available
- On the capabilities of multilayer perceptrons
- Bounds on the learning capacity of some multi-layer networks
- The capacity of feedforward neural networks
- Gradient descent optimizes over-parameterized deep ReLU networks
Cited In (11)
- Stable recovery of entangled weights: towards robust identification of deep neural networks from minimal samples
- The interpolation phase transition in neural networks: memorization and generalization under lazy training
- Long term memory storage capacity of multiconnected neural networks
- Just least squares: binary compressive sampling with low generative intrinsic dimension
- Title not available
- The capacity of feedforward neural networks
- Information theory and recovery algorithms for data fusion in Earth observation
- Designing universal causal deep learning models: The geometric (Hyper)transformer
- Expressive power of ReLU and step networks under floating-point operations
- Approximation in shift-invariant spaces with deep ReLU neural networks
- Overparameterized neural networks implement associative memory