On random matrices arising in deep neural networks: General I.I.D. case
DOI: 10.1142/S2010326322500460 · zbMATH Open: 1517.60015 · arXiv: 2011.11439 · OpenAlex: W3106743998 · Wikidata: Q114071582 · MaRDI QID: Q6163573
Authors: L. A. Pastur, Victor Slavin
Publication date: 26 June 2023
Published in: Random Matrices: Theory and Applications
Full work available at URL: https://arxiv.org/abs/2011.11439
Recommendations
- On Random Matrices Arising in Deep Neural Networks. Gaussian Case
- Eigenvalue distribution of large random matrices arising in deep neural networks: Orthogonal case
- Nonlinear random matrix theory for deep learning
- Eigenvalue distribution of some nonlinear models of random matrices
MSC classification:
- Random matrices (probabilistic aspects) (60B20)
- Neural networks for/in biological studies, artificial life and related topics (92B20)
Cites Work
- Spectral analysis of large dimensional random matrices
- Random matrices: universality of ESDs and the circular law
- High-dimensional probability. An introduction with applications in data science
- Distribution of eigenvalues for some sets of random matrices
- Deep learning
- Eigenvalue distribution of large random matrices
- Asymptotic spectra of matrix-valued functions of independent random matrices and free probability
- Analysis of the limiting spectral measure of large random matrices of the separable covariance type
- Free probability and random matrices
- Deep Neural Networks with Random Gaussian Weights: A Universal Classification Strategy?
- On the asymptotic eigenvalue distribution of concatenated vector-valued fading channels
- The loss surfaces of neural networks with general activation functions
- Deep Neural Networks in a Mathematical Framework
- The Principles of Deep Learning Theory
Cited In (13)
- Linear eigenvalue statistics of XX′ matrices
- The Law of Multiplication of Large Random Matrices Revisited
- On Random Matrices Arising in Deep Neural Networks. Gaussian Case
- Eigenvalue distribution of large random matrices arising in deep neural networks: Orthogonal case
- Universal characteristics of deep neural network loss surfaces from random matrix theory
- Large-dimensional random matrix theory and its applications in deep learning and wireless communications
- A random matrix approach to neural networks
- Nonlinear random matrix theory for deep learning
- Products of many large random matrices and gradients in deep neural networks
- Eigenvalue distribution of some nonlinear models of random matrices
- Asymptotic freeness of layerwise Jacobians caused by invariance of multilayer perceptron: the Haar orthogonal case
- A note on the Pennington-Worah distribution