Entropy-SGD
Software: 1354032
swMATH ID: 41231 · MaRDI QID: Q1354032
Author name not available.
Source code repository: https://github.com/ucla-vision/entropy-sgd
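
For orientation, here is a minimal NumPy sketch of the Entropy-SGD update scheme associated with this software (an inner SGLD loop estimates the mean of a local Gibbs measure around the current parameters, and the outer step follows the resulting local-entropy gradient). The toy loss, hyperparameter values, and names such as `loss_grad` and `entropy_sgd_step` are illustrative assumptions and are not taken from the linked repository.

```python
# Hypothetical minimal sketch of an Entropy-SGD-style update; the toy loss and
# all hyperparameters are assumptions, not the repository's implementation.
import numpy as np

rng = np.random.default_rng(0)

def loss_grad(x):
    """Gradient of a toy nonconvex loss f(x) = sum(x^2) + sum(cos(3x))."""
    return 2.0 * x - 3.0 * np.sin(3.0 * x)

def entropy_sgd_step(x, gamma=1e-2, eta=0.1, eta_prime=0.1,
                     n_inner=20, alpha=0.75, eps=1e-4):
    """One outer step: run SGLD around x to estimate the mean mu of the local
    Gibbs measure exp(-f(x') - gamma/2 * ||x' - x||^2), then move x along the
    (negative) local-entropy gradient gamma * (x - mu)."""
    x_prime = x.copy()
    mu = x.copy()
    for _ in range(n_inner):
        g = loss_grad(x_prime) + gamma * (x_prime - x)          # gradient of the proximal loss
        noise = np.sqrt(eta_prime) * eps * rng.standard_normal(x.shape)
        x_prime = x_prime - eta_prime * g + noise               # Langevin (SGLD) step
        mu = (1.0 - alpha) * mu + alpha * x_prime               # running average of samples
    return x - eta * gamma * (x - mu)                           # outer gradient step

x = rng.standard_normal(10)
for _ in range(200):
    x = entropy_sgd_step(x)
print("final parameters:", x)
```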
Cited In (20)
- A spin glass model for the loss surfaces of generative adversarial networks
- Comparing dynamics: deep neural networks versus glassy systems
- Selection dynamics for deep neural networks
- Forward stability of ResNet and its variants
- Stochastic backward Euler: an implicit gradient descent algorithm for k-means clustering
- Title not available.
- Ensemble Kalman inversion: a derivative-free technique for machine learning tasks
- Deep relaxation: partial differential equations for optimizing deep neural networks
- Building a telescope to look into high-dimensional image spaces
- Chaos and complexity from quantum neural network. A study with diffusion metric in machine learning
- Entropic gradient descent algorithms and wide flat minima*
- Bias of homotopic gradient descent for the hinge loss
- Dynamics of stochastic gradient descent for two-layer neural networks in the teacher–student setup*
- On Bayesian posterior mean estimators in imaging sciences and Hamilton-Jacobi partial differential equations
- Interpretable machine learning: fundamental principles and 10 grand challenges
- The committee machine: computational to statistical gaps in learning a two-layers neural network
- Global Minima of Overparameterized Neural Networks
- Wasserstein-Based Projections with Applications to Inverse Problems
- Optimization for deep learning: an overview
- Run-and-inspect method for nonconvex optimization and global optimality bounds for R-local minimizers