Pages that link to "Item:Q4560301"
From MaRDI portal
The following pages link to Deep distributed convolutional neural networks: Universality (Q4560301):
Displaying 50 items.
- Learning under \((1 + \epsilon)\)-moment conditions (Q778021) (← links)
- Approximation rates for neural networks with general activation functions (Q1982446) (← links)
- Distributed regularized least squares with flexible Gaussian kernels (Q2036424) (← links)
- Theory of deep convolutional neural networks. II: Spherical analysis (Q2057723) (← links)
- Rates of approximation by neural network interpolation operators (Q2073064) (← links)
- Stochastic Markov gradient descent and training low-bit neural networks (Q2073135) (← links)
- On the speed of uniform convergence in Mercer's theorem (Q2091033) (← links)
- Functional linear regression with Huber loss (Q2099272) (← links)
- Approximation of functions from Korobov spaces by deep convolutional neural networks (Q2108977) (← links)
- A mesh-free method using piecewise deep neural network for elliptic interface problems (Q2141617) (← links)
- Approximation properties of deep ReLU CNNs (Q2157922) (← links)
- Rates of convergence of randomized Kaczmarz algorithms in Hilbert spaces (Q2168686) (← links)
- Learning rate of distribution regression with dependent samples (Q2171946) (← links)
- Theory of deep convolutional neural networks: downsampling (Q2185717) (← links)
- Learning rates for the kernel regularized regression with a differentiable strongly convex loss (Q2191832) (← links)
- Approximation by multivariate max-product Kantorovich-type operators and learning rates of least-squares regularized regression (Q2191850) (← links)
- Optimal learning rates for distribution regression (Q2283125) (← links)
- Universality of deep convolutional neural networks (Q2300759) (← links)
- MgNet: a unified framework of multigrid and convolutional neural network (Q2316958) (← links)
- Distributed semi-supervised regression learning with coefficient regularization (Q2668180) (← links)
- Bias corrected regularization kernel method in ranking (Q4615656) (← links)
- (Q4969157) (← links)
- Balanced joint maximum mean discrepancy for deep transfer learning (Q4995048) (← links)
- Error analysis of the moving least-squares regression learning algorithm with \(\beta\)-mixing and non-identical sampling (Q5030625) (← links)
- (Q5053184) (← links)
- Modified proximal symmetric ADMMs for multi-block separable convex optimization with linear constraints (Q5075574) (← links)
- Neural network interpolation operators activated by smooth ramp functions (Q5083442) (← links)
- Weighted random sampling and reconstruction in general multivariate trigonometric polynomial spaces (Q5089731) (← links)
- (Q5104591) (← links)
- On the K-functional in learning theory (Q5107666) (← links)
- PhaseMax: Stable guarantees from noisy sub-Gaussian measurements (Q5132229) (← links)
- Approximation Properties of Ridge Functions and Extreme Learning Machines (Q5154637) (← links)
- Equivalence of approximation by convolutional neural networks and fully-connected networks (Q5218202) (← links)
- Online regularized pairwise learning with least squares loss (Q5220066) (← links)
- Deep neural networks for rotation-invariance approximation and learning (Q5236745) (← links)
- Robust randomized optimization with \(k\) nearest neighbors (Q5236747) (← links)
- Distributed Filtered Hyperinterpolation for Noisy Data on the Sphere (Q5855635) (← links)
- Learning rates for partially linear support vector machine in high dimensions (Q5856267) (← links)
- Approximation by max-product sampling Kantorovich operators with generalized kernels (Q5856314) (← links)
- Approximating functions with multi-features by deep convolutional neural networks (Q5873927) (← links)
- Compressed data separation via unconstrained \(l_1\)-split analysis (Q5889890) (← links)
- Theory of deep convolutional neural networks. III: Approximating radial functions (Q6055154) (← links)
- Approximating smooth and sparse functions by deep neural networks: optimal approximation rates and saturation (Q6062170) (← links)
- Rates of approximation by ReLU shallow neural networks (Q6062171) (← links)
- Neural network interpolation operators optimized by Lagrange polynomial (Q6077041) (← links)
- Probabilistic robustness estimates for feed-forward neural networks (Q6079061) (← links)
- Convergence on sequences of Szász-Jakimovski-Leviatan type operators and related results (Q6112828) (← links)
- Convergence theorems in Orlicz and Bögel continuous functions spaces by means of Kantorovich discrete type sampling operators (Q6112839) (← links)
- Some new inequalities and numerical results of bivariate Bernstein-type operator including Bézier basis and its GBS operator (Q6112850) (← links)
- Rate of convergence of Stancu type modified \(q\)-Gamma operators for functions with derivatives of bounded variation (Q6112860) (← links)