The following pages link to (Q5381117):
Displayed 38 items.
- Learning under \((1 + \epsilon)\)-moment conditions (Q778021)
- Topological properties of the set of functions generated by neural networks of fixed size (Q2031060)
- Fast generalization error bound of deep learning without scale invariance of activation functions (Q2055056)
- On the rate of convergence of image classifiers based on convolutional neural networks (Q2087403)
- The interpolation phase transition in neural networks: memorization and generalization under lazy training (Q2105197)
- Approximation spaces of deep neural networks (Q2117336)
- Mathematical foundations of machine learning. Abstracts from the workshop held March 21--27, 2021 (hybrid meeting) (Q2131208)
- The capacity of feedforward neural networks (Q2183684)
- Estimation of a regression function on a manifold by fully connected deep neural networks (Q2676904)
- On the rate of convergence of a deep recurrent neural network estimate in a regression problem with dependent data (Q2692553)
- Memory Capacity of Neural Networks with Threshold and Rectified Linear Unit Activations (Q5037553)
- Analysis of the Generalization Error: Empirical Risk Minimization over Deep Artificial Neural Networks Overcomes the Curse of Dimensionality in the Numerical Approximation of Black--Scholes Partial Differential Equations (Q5037569)
- (Q5053236)
- Quantitative Approximation Results for Complex-Valued Neural Networks (Q5073921)
- Convergence Rate Analysis for Deep Ritz Method (Q5077692)
- A Rate of Convergence of Physics Informed Neural Networks for the Linear Second Order Elliptic PDEs (Q5077701)
- Imaging conductivity from current density magnitude using neural networks* (Q5081798)
- Stationary Density Estimation of Itô Diffusions Using Deep Learning (Q5886225)
- Sharp Bounds for the Number of Regions of Maxout Networks and Vertices of Minkowski Sums (Q5886829)
- Deep learning: a statistical viewpoint (Q5887827)
- Neural network approximation (Q5887830)
- Approximation bounds for norm constrained neural networks with applications to regression and GANs (Q6038825)
- A Deep Generative Approach to Conditional Sampling (Q6077577)
- The Kolmogorov-Arnold representation theorem revisited (Q6078698)
- Deep ReLU neural networks in high-dimensional approximation (Q6079085)
- PAC-learning with approximate predictors (Q6103580)
- Domain adversarial neural networks for domain generalization: when it works and how to improve (Q6134339)
- Convergence rates for shallow neural networks learned by gradient descent (Q6137712)
- Nonexact oracle inequalities, \(r\)-learnability, and fast rates (Q6149162)
- Adaptive novelty detection with false discovery rate guarantee (Q6151968)
- Just least squares: binary compressive sampling with low generative intrinsic dimension (Q6159304)
- Neural network approximation and estimation of classifiers with classification boundary in a Barron class (Q6165247)
- Deep nonparametric regression on approximate manifolds: nonasymptotic error bounds with polynomial prefactors (Q6172194)
- Deep Ritz method for elliptical multiple eigenvalue problems (Q6182319)
- Learning ability of interpolating deep convolutional neural networks (Q6185680)
- Designing universal causal deep learning models: The geometric (Hyper)transformer (Q6196301)
- Estimating a regression function in exponential families by model selection (Q6201869)
- Approximation in shift-invariant spaces with deep ReLU neural networks (Q6341347)