Archetypal landscapes for deep neural networks
From MaRDI portal
Publication:5073145
Cites work
- Scientific article; zbMATH DE number 5621256 (no title available)
- A Family of Variable-Metric Methods Derived by Variational Means
- A general reinforcement learning algorithm that masters chess, shogi, and Go through self-play
- A mean field view of the landscape of two-layer neural networks
- A new approach to variable metric algorithms
- Cascading classifiers
- Comparing dynamics: deep neural networks versus glassy systems
- Conditioning of Quasi-Newton Methods for Function Minimization
- Entropy-SGD: biasing gradient descent into wide valleys
- Flat Minima
- The Convergence of a Class of Double-rank Minimization Algorithms 1. General Considerations
Cited in (11)
- Comparing dynamics: deep neural networks versus glassy systems
- Black holes and the loss landscape in machine learning
- Spurious valleys in one-hidden-layer neural network optimization landscapes
- Shaping the learning landscape in neural networks around wide flat minima
- Landscape and training regimes in deep learning
- Community detection-based deep neural network architectures: a fully automated framework based on Likert-scale data
- Optimization Landscape of Neural Networks
- Revisiting Landscape Analysis in Deep Neural Networks: Eliminating Decreasing Paths to Infinity
- Towards interpreting deep neural networks via layer behavior understanding
- Dissecting a small artificial neural network
- Barcodes as summary of loss function topology