Pages that link to "Item:Q5219306"
The following pages link to Mean Field Analysis of Neural Networks: A Law of Large Numbers (Q5219306):
Displaying 35 items.
- A selective overview of deep learning (Q2038303)
- Reinforcement learning and stochastic optimisation (Q2072112)
- Normalization effects on shallow neural networks and related asymptotic expansions (Q2072629)
- Mean-field Langevin dynamics and energy landscape of neural networks (Q2077356)
- Supervised learning from noisy observations: combining machine-learning techniques with data assimilation (Q2077682)
- Propagation of chaos: a review of models, methods and applications. I: Models and methods (Q2088752)
- Propagation of chaos: a review of models, methods and applications. II: Applications (Q2088753)
- Asymptotic properties of one-layer artificial neural networks with sparse connectivity (Q2105365)
- Representation formulas and pointwise properties for Barron functions (Q2113295)
- Nonlocal cross-diffusion systems for multi-species populations and networks (Q2119411)
- Surprises in high-dimensional ridgeless least squares interpolation (Q2131262)
- Sparse optimization on measures with over-parameterized gradient descent (Q2149558)
- Landscape and training regimes in deep learning (Q2231925)
- (Q5011561)
- (Q5054655)
- Two-Layer Neural Networks with Values in a Banach Space (Q5055293)
- Mean Field Analysis of Deep Neural Networks (Q5076694)
- Asymptotics of Reinforcement Learning with Neural Networks (Q5084496)
- Large Sample Mean-Field Stochastic Optimization (Q5097396)
- Mean Field Limits for Interacting Diffusions with Colored Noise: Phase Transitions and Spectral Numerical Methods (Q5150070)
- Fast Non-mean-field Networks: Uniform in Time Averaging (Q5150325)
- Suboptimal Local Minima Exist for Wide Neural Networks with Smooth Activations (Q5870356)
- Mehler’s Formula, Branching Process, and Compositional Kernels of Deep Neural Networks (Q5881138)
- The Continuous Formulation of Shallow Neural Networks as Wasserstein-Type Gradient Flows (Q5886422)
- Deep learning: a statistical viewpoint (Q5887827)
- A rigorous framework for the mean field limit of multilayer neural networks (Q6062704)
- A class of dimension-free metrics for the convergence of empirical measures (Q6072907)
- Sharp uniform-in-time propagation of chaos (Q6095842)
- Continuous limits of residual neural networks in case of large input data (Q6098879)
- Online parameter estimation for the McKean-Vlasov stochastic differential equation (Q6115259)
- A blob method for inhomogeneous diffusion with applications to multi-agent control and sampling (Q6133439)
- Non-mean-field Vicsek-type models for collective behavior (Q6157829)
- Stochastic gradient descent with noise of machine learning type. II: Continuous time analysis (Q6188971)
- Normalization effects on deep neural networks (Q6194477)
- Gradient descent on infinitely wide neural networks: global convergence and generalization (Q6200217)