A theory of capacity and sparse neural encoding
From MaRDI portal
Publication:6079092
Abstract: Motivated by biological considerations, we study sparse neural maps from an input layer to a target layer with sparse activity, and specifically the problem of storing input-target associations, or memories, when the target vectors are sparse. We mathematically prove that the storage capacity undergoes a phase transition and that, in general and somewhat paradoxically, sparsity in the target layer increases the storage capacity of the map. The target vectors can be chosen arbitrarily, including at random, and the memories can be both encoded and decoded by networks trained using local learning rules, including the simple Hebb rule. These results are robust under a variety of statistical assumptions on the data. The proofs rely on elegant properties of random polytopes and sub-Gaussian random vectors. Open problems and connections to capacity theories and polynomial threshold maps are discussed.
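The abstract's central mechanism, storing sparse input-target associations with the Hebb rule and recalling them by thresholding, can be illustrated with a minimal sketch. The dimensions, sparsity level, and threshold below are illustrative assumptions, not values from the paper: dense ±1 input patterns are associated with sparse binary targets via outer-product (Hebbian) weights, and a stored target is decoded by thresholding the map's output.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: input dimension n, target dimension m,
# number of stored memories K, and active units per sparse target.
n, m, K, active = 200, 200, 10, 5

# Dense +/-1 input patterns.
X = rng.choice([-1.0, 1.0], size=(K, n))

# Sparse binary target vectors: only a few active units each.
Y = np.zeros((K, m))
for mu in range(K):
    Y[mu, rng.choice(m, size=active, replace=False)] = 1.0

# Hebbian (outer-product) weight matrix, normalized by input dimension.
W = Y.T @ X / n

# Decode the first memory: the signal term contributes ~1 to each unit
# active in Y[0], while crosstalk from the other K-1 memories is small,
# so a threshold of 0.5 recovers the sparse target.
recalled = (W @ X[0] > 0.5).astype(float)
```

With few memories relative to the input dimension, the crosstalk noise is far below the threshold and recall is exact; the paper's capacity analysis concerns how large K can grow before this breaks down, and how target sparsity shifts that transition.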
Recommendations
Cites work
- scientific article; zbMATH DE number 516161
- scientific article; zbMATH DE number 845714
- scientific article; zbMATH DE number 3189712
- 10.1162/15324430152748236
- A mathematical introduction to compressive sensing
- Asymptotic geometric analysis. I
- Asymptotic shape of a random polytope in a convex body
- Banach-Mazur distances and projections on random subgaussian polytopes
- Bayesian Variable Selection in Linear Regression
- Central limit theorems for Gaussian polytopes
- Compressed sensing
- Compressive sampling
- Cones generated by random points on half-spheres and convex hulls of Poisson point processes
- Counting faces of randomly projected polytopes when the projection radically lowers dimension
- Counting the faces of randomly-projected hypercubes and orthants, with applications
- Deep learning in science
- Dimension reduction by random hyperplane tessellations
- EXTREMAL PROPERTIES OF ORTHOGONAL PARALLELEPIPEDS AND THEIR APPLICATIONS TO THE GEOMETRY OF BANACH SPACES
- Expected intrinsic volumes and facet numbers of random beta-polytopes
- Gaussian polytopes: a cumulant-based approach
- Gaussian polytopes: variances and limit theorems
- High-dimensional probability. An introduction with applications in data science
- Linear Inversion of Band-Limited Reflection Seismograms
- Living on the edge: phase transitions in convex programs with random data
- Neural networks and physical systems with emergent collective computational abilities
- One-bit compressed sensing with non-Gaussian measurements
- Polynomial threshold functions, hyperplane arrangements, and random tensors
- Probability
- Random polytopes
- Random projections of regular polytopes
- Random projections of regular simplices
- Random spaces generated by vertices of the cube
- Robust 1-bit Compressed Sensing and Sparse Logistic Regression: A Convex Programming Approach
- Smallest singular value of random matrices and geometry of random polytopes
- Support-vector networks
- The Generalized Lasso With Non-Linear Observations
- The capacity of feedforward neural networks
- The geometry of random \(\{-1,1\}\)-polytopes
- The horseshoe estimator for sparse signals
- Universality in polytope phase transitions and message passing algorithms
Cited in (7)
- Expansion of information in the binary autoencoder with random binary weights
- Information theoretic limits of learning a sparse rule
- Tractability from overparametrization: the example of the negative perceptron
- Sparse coding for layered neural networks
- Lah distribution: Stirling numbers, records on compositions, and convex hulls of high-dimensional random walks
- What intraclass covariance structures can symmetric Bernoulli random variables have?
- Lower bounds on the capacities of binary and ternary networks storing sparse random vectors