Prevalence of neural collapse during the terminal phase of deep learning training
DOI: 10.1073/pnas.2015509117
zbMath: 1489.68237
arXiv: 2008.08186
OpenAlex: W3065974826
Wikidata: Q99594088
Scholia: Q99594088
MaRDI QID: Q5073172
Authors: Vardan Papyan, X. Y. Han, David L. Donoho
Publication date: 5 May 2022
Published in: Proceedings of the National Academy of Sciences
Full work available at URL: https://arxiv.org/abs/2008.08186
Related Items (3)
- Neural collapse under cross-entropy loss
- Neural collapse with unconstrained features
- Deep regularization and direct training of the inner layers of neural networks with kernel flows
Cites Work
- Grassmannian frames with applications to coding and communication
- Adversarial noise attacks of deep learning architectures: stability analysis via sparse-modeled signals
- Group Invariant Scattering
- A Mathematical Theory of Deep Convolutional Neural Networks for Feature Extraction
- Multi-Layer Sparse Coding: The Holistic Way
- Reconciling modern machine-learning practice and the classical bias–variance trade-off