Size-independent sample complexity of neural networks
Publication: 5006527
DOI: 10.1093/imaiai/iaz007
OpenAlex: W2963038205
Wikidata: Q128243149
Scholia: Q128243149
MaRDI QID: Q5006527
Noah Golowich, Alexander Rakhlin, Ohad Shamir
Publication date: 16 August 2021
Published in: Information and Inference: A Journal of the IMA
Full work available at URL: https://arxiv.org/abs/1712.06541
Related Items (18)
Deep learning: a statistical viewpoint
Benign overfitting in linear regression
Nonlinear Weighted Directed Acyclic Graph and A Priori Estimates for Neural Networks
Approximation bounds for norm constrained neural networks with applications to regression and GANs
Deep empirical risk minimization in finance: Looking into the future
Adversarial Robustness of Sparse Local Lipschitz Predictors
PAC-learning with approximate predictors
Positive-unlabeled classification under class-prior shift: a prior-invariant approach based on density ratio estimation
Unnamed Item
Unnamed Item
Unnamed Item
A selective overview of deep learning
Regularisation of neural networks by enforcing Lipschitz continuity
Variational Monte Carlo -- bridging concepts of machine learning and high-dimensional partial differential equations
On the Purity and Entropy of Mixed Gaussian States
Compressive sensing and neural networks from a statistical learning perspective
Learning Finite-Dimensional Coding Schemes with Nonlinear Reconstruction Maps
Robust and resource-efficient identification of two hidden layer neural networks