Chaining meets chain rule: multilevel entropic regularization and training of neural networks
From MaRDI portal
Publication:4969253
Authors: Amir R. Asadi, Emmanuel Abbe
Publication date: 5 October 2020
Full work available at URL: https://arxiv.org/abs/1906.11148
Keywords: neural networks; chaining mutual information; multilevel relative entropy; multiscale generalization bound; multiscale Gibbs distribution
Cites Work
- Title not available
- Stochastic Relaxation, Gibbs Distributions, and the Bayesian Restoration of Images
- High-dimensional probability. An introduction with applications in data science
- Title not available
- Probability and stochastics.
- Upper and lower bounds for stochastic processes. Modern methods and classical problems
- On prediction of individual sequences
- Rényi Divergence and Kullback-Leibler Divergence
- Fifty years of Shannon theory
- Sparse estimation by exponential weighting
- From \(\varepsilon\)-entropy to KL-entropy: analysis of minimum information complexity density estimation
- On the properties of variational approximations of Gibbs posteriors
- Information-theoretic upper and lower bounds for statistical estimation
- Title not available
- Learners that use little information
Cited In (1)