Deep stable neural networks: large-width asymptotics and convergence rates
DOI: 10.3150/22-BEJ1553 · arXiv: 2108.02316 · MaRDI QID: Q6103259 · FDO: Q6103259
Authors: Stefano Favaro, Sandra Fortini, Stefano Peluchetti
Publication date: 2 June 2023
Published in: Bernoulli
Full work available at URL: https://arxiv.org/abs/2108.02316
Keywords: Bayesian inference; spectral measure; Gaussian stochastic process; deep neural network; exchangeable sequence; depth limit; neural tangent kernel; stable stochastic process; infinitely wide limit; sup-norm convergence rate
MSC classifications: Bayesian inference (62F15) · Gaussian processes (60G15) · Artificial neural networks and deep learning (68T07) · Stable stochastic processes (60G52)
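As a rough illustration of the phenomenon named in the title (this is not code from the paper; all names, parameter choices, and the activation function below are illustrative assumptions), the following Python sketch samples a one-hidden-layer network with i.i.d. symmetric α-stable weights under the stable scaling n^(-1/α) and checks that the output distribution stabilizes as the width n grows, consistent with convergence to an α-stable limit. Sampling uses scipy.stats.levy_stable.

```python
# Minimal sketch, assuming a one-hidden-layer network with i.i.d. symmetric
# alpha-stable weights and tanh activations; the n^(-1/alpha) scaling is the
# one under which wide stable networks admit a stable process limit.
import numpy as np
from scipy.stats import levy_stable

rng = np.random.default_rng(0)
alpha = 1.8   # stability index in (0, 2); alpha = 2 recovers the Gaussian case
x = 0.7       # a fixed network input

def sample_outputs(width, n_draws=5_000):
    """Draw network outputs at input x for a given hidden-layer width."""
    # hidden-layer and output weights, i.i.d. symmetric alpha-stable (beta = 0)
    u = levy_stable.rvs(alpha, 0.0, size=(n_draws, width), random_state=rng)
    w = levy_stable.rvs(alpha, 0.0, size=(n_draws, width), random_state=rng)
    # sum over hidden units with the stable scaling n^(-1/alpha)
    return (w * np.tanh(u * x)).sum(axis=1) / width ** (1.0 / alpha)

for n in (1, 10, 100, 1000):
    out = sample_outputs(n)
    # median absolute deviation as a robust spread measure: stable laws with
    # alpha < 2 have infinite variance, so the sample variance is uninformative
    mad = np.median(np.abs(out - np.median(out)))
    print(f"width {n:>4}: MAD of outputs = {mad:.3f}")
```

Under this scaling the printed spread settles near a constant as the width grows, since conditionally on the hidden weights the rescaled sum is exactly α-stable with a scale that concentrates by the law of large numbers; for α = 2 the same experiment with the familiar n^(-1/2) scaling reproduces the Gaussian large-width limit.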
Cites Work
- Gaussian processes for machine learning
- Approximation of multidimensional stable densities
- Merging of Opinions with Increasing Information
- Inequalities for the $r$th Absolute Moment of a Sum of Random Variables, $1 \leqq r \leqq 2$
- Metrics for multivariate stable distributions
- Bayesian learning for neural networks
- Central Limit Theorems for Interchangeable Processes
- Wide neural networks of any depth evolve as linear models under gradient descent
- Bayesian neural network priors for edge-preserving inversion
Cited In (5)
- Continuous limits of residual neural networks in case of large input data
- Gaussian random field approximation via Stein's method with applications to wide random neural networks
- On the existence of stable equilibrium points for CNNs
- α-Stable convergence of heavy-/light-tailed infinitely wide neural networks
- On the S-instability and degeneracy of discrete deep learning models