Deep stable neural networks: large-width asymptotics and convergence rates
DOI: 10.3150/22-bej1553
arXiv: 2108.02316
MaRDI QID: Q6103259
Stefano Favaro, Sandra Fortini, Stefano Peluchetti
Publication date: 2 June 2023
Published in: Bernoulli
Full work available at URL: https://arxiv.org/abs/2108.02316
Keywords: spectral measure; Bayesian inference; Gaussian stochastic process; deep neural network; exchangeable sequence; depth limit; neural tangent kernel; stable stochastic process; infinitely wide limit; sup-norm convergence rate
MSC classification: Gaussian processes (60G15); Artificial neural networks and deep learning (68T07); Bayesian inference (62F15); Stable stochastic processes (60G52)
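The keywords above combine stable stochastic processes with infinitely wide limits: when network weights are heavy-tailed (alpha-stable) rather than Gaussian, wide-layer pre-activations converge to a stable, not Gaussian, law. A minimal illustrative sketch of this phenomenon (not the paper's construction; the one-layer setup, fixed unit inputs, and alpha = 1 Cauchy weights are all simplifying assumptions chosen here):

```python
import numpy as np

rng = np.random.default_rng(0)

def preactivation(width, n_samples, alpha=1.0):
    """Pre-activation of one unit fed by `width` inputs fixed at 1,
    with i.i.d. standard Cauchy weights (alpha = 1 stable)."""
    w = rng.standard_cauchy((n_samples, width))
    # alpha-stable scaling width**(-1/alpha); for Gaussian weights the
    # CLT scaling width**(-1/2) would give a Gaussian limit instead.
    return w.sum(axis=1) * width ** (-1.0 / alpha)

z = preactivation(width=1000, n_samples=200_000)
# By the stability property, z is again standard Cauchy at any width,
# whose quartiles are -1 and +1, i.e. interquartile range 2.
q1, q3 = np.quantile(z, [0.25, 0.75])
print(round(q3 - q1, 2))  # close to 2
```

The empirical interquartile range stays near the standard Cauchy value 2 regardless of width, whereas under Gaussian weights it would match the Gaussian limit; this is the large-width dichotomy the title's "stable" qualifier refers to.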
Cites Work
- Approximation of multidimensional stable densities
- Bayesian learning for neural networks
- Bayesian neural network priors for edge-preserving inversion
- Metrics for multivariate stable distributions
- Central Limit Theorems for Interchangeable Processes
- Merging of Opinions with Increasing Information
- Inequalities for the $r$th Absolute Moment of a Sum of Random Variables, $1 \leqq r \leqq 2$
- Wide neural networks of any depth evolve as linear models under gradient descent