Pages that link to "Item:Q5073211"
The following pages link to Theoretical issues in deep networks (Q5073211):
Displaying 13 items.
- An analytic layer-wise deep learning framework with applications to robotics (Q2059394) (← links)
- Estimation of a regression function on a manifold by fully connected deep neural networks (Q2676904) (← links)
- On the influence of over-parameterization in manifold based surrogates and deep neural operators (Q2687573) (← links)
- Locality defeats the curse of dimensionality in convolutional teacher–student scenarios* (Q5055428) (← links)
- The unreasonable effectiveness of deep learning in artificial intelligence (Q5073209) (← links)
- Theoretical issues in deep networks (Q5073211) (← links)
- The generalized extreme learning machines: tuning hyperparameters and limiting approach for the Moore-Penrose generalized inverse (Q6055151) (← links)
- Theory of deep convolutional neural networks. III: Approximating radial functions (Q6055154) (← links)
- Towards understanding theoretical advantages of complex-reaction networks (Q6077000) (← links)
- Convergence rates for shallow neural networks learned by gradient descent (Q6137712) (← links)
- Deep networks for system identification: a survey (Q6659190) (← links)
- Kolmogorov-Arnold-informed neural network: a physics-informed deep learning framework for solving forward and inverse problems based on Kolmogorov-Arnold networks (Q6669014) (← links)
- Dissecting a small artificial neural network (Q6670409) (← links)