On the convergence of formally diverging neural net-based classifiers
Publication: 1747388
DOI: 10.1016/j.crma.2018.03.003 · zbMath: 1390.68516 · OpenAlex: W2793287646 · MaRDI QID: Q1747388
Pierre-Emmanuel Jabin, Leonid Berlyand
Publication date: 8 May 2018
Published in: Comptes Rendus. Mathématique. Académie des Sciences, Paris
Full work available at URL: https://doi.org/10.1016/j.crma.2018.03.003
Classification (MSC): Classification and discrimination; cluster analysis (statistical aspects) (62H30) · Learning and adaptive systems in artificial intelligence (68T05)
Related Items
- A novel multi-scale loss function for classification problems in machine learning
- Stability for the training of deep neural networks and other classifiers
Uses Software
Cites Work
- Unnamed Item
- Subgeometric rates of convergence in Wasserstein distance for Markov chains
- Global-in-time weak measure solutions and finite-time aggregation for nonlocal interaction equations
- Asymptotic coupling and a general form of Harris' theorem with applications to stochastic delay equations
- Practical drift conditions for subgeometric rates of convergence
- On asymptotics for Vaserstein coupling of Markov chains
- Subgeometric rates of convergence of Markov processes in the Wasserstein metric
- General relative entropy inequality: an illustration on growth models
- Group Invariant Scattering
- Markov Chains and Stochastic Stability
- Deep Haar scattering networks
- Lipschitz properties for deep convolutional networks