Archetypal landscapes for deep neural networks
Publication:5073145
Cites work
- Untitled scientific article (zbMATH DE number 5621256)
- A Family of Variable-Metric Methods Derived by Variational Means
- A general reinforcement learning algorithm that masters chess, shogi, and Go through self-play
- A mean field view of the landscape of two-layer neural networks
- A new approach to variable metric algorithms
- Cascading classifiers
- Comparing dynamics: deep neural networks versus glassy systems
- Conditioning of Quasi-Newton Methods for Function Minimization
- Entropy-SGD: biasing gradient descent into wide valleys
- Flat Minima
- The Convergence of a Class of Double-rank Minimization Algorithms 1. General Considerations
Cited in (11)
- Revisiting Landscape Analysis in Deep Neural Networks: Eliminating Decreasing Paths to Infinity
- Community detection-based deep neural network architectures: a fully automated framework based on Likert-scale data
- Shaping the learning landscape in neural networks around wide flat minima
- Comparing dynamics: deep neural networks versus glassy systems
- Spurious valleys in one-hidden-layer neural network optimization landscapes
- Black holes and the loss landscape in machine learning
- Towards interpreting deep neural networks via layer behavior understanding
- Optimization Landscape of Neural Networks
- Landscape and training regimes in deep learning
- Barcodes as summary of loss function topology
- Dissecting a small artificial neural network