α-Stable convergence of heavy-/light-tailed infinitely wide neural networks
From MaRDI portal
Publication:6198073
DOI: 10.1017/APR.2023.3 arXiv: 2106.11064 MaRDI QID: Q6198073
No author found.
Publication date: 20 February 2024
Published in: Advances in Applied Probability
Full work available at URL: https://arxiv.org/abs/2106.11064
Central limit and other weak theorems (60F05)
Artificial neural networks and deep learning (68T07)
Stable stochastic processes (60G52)
Neural nets and related approaches to inference from stochastic processes (62M45)
Cites Work
- Bayesian learning for neural networks
- Quantile regression neural networks: a Bayesian approach
- Random Measures, Theory and Applications
- On certain self-decomposable distributions
- Probability
- The Principles of Deep Learning Theory
- On the behaviour of the characteristic function of a probability distribution in the neighbourhood of the origin
- A representation theorem for symmetric stable processes and stable measures on H
- Probabilistic Symmetries and Invariance Principles
- Fundamentals of Nonparametric Bayesian Inference
- Priors in Bayesian Deep Learning: A Review
- Neural tangent kernel: convergence and generalization in neural networks (invited paper)
- Deep stable neural networks: large-width asymptotics and convergence rates