On radial basis function nets and kernel regression: Statistical consistency, convergence rates, and receptive field size
From MaRDI portal
Recommendations
- scientific article; zbMATH DE number 1735798
- Convergence and rates of convergence of recursive radial basis functions networks in function learning and classification
- scientific article; zbMATH DE number 1156639
- Learning and Convergence of the Normalized Radial Basis Functions Networks
- scientific article; zbMATH DE number 1439407
Cites work
- scientific article; zbMATH DE number 4040466
- scientific article; zbMATH DE number 4074523
- scientific article; zbMATH DE number 3563431
- scientific article; zbMATH DE number 3620754
- scientific article; zbMATH DE number 4001210
- scientific article; zbMATH DE number 4183235
- An equivalence theorem for \(L_1\) convergence of the kernel regression estimate
- Distribution-free pointwise consistency of kernel regression estimate
- Multilayer feedforward networks are universal approximators
- Networks and the best approximation property
- On exponential bounds on the Bayes risk of the kernel classification rule
- On radial basis function nets and kernel regression: Statistical consistency, convergence rates, and receptive field size
- On the almost everywhere convergence of nonparametric regression function estimates
- Probability Inequalities for Sums of Bounded Random Variables
- Probability Inequalities for the Sum of Independent Random Variables
- Regularization algorithms for learning that are equivalent to multilayer networks
- The pointwise rate of convergence of the kernel regression estimate
- The rates of convergence of kernel regression estimates and classification rules
Cited in (13)
- Flexible regression modeling
- On Learning and Convergence of RBF Networks in Regression Estimation and Classification
- Learning and Convergence of the Normalized Radial Basis Functions Networks
- Kernel Based Learning Methods: Regularization Networks and RBF Networks
- Pattern recognition with ordered labels
- Random Projection RBF Nets for Multidimensional Density Estimation
- On radial basis function nets and kernel regression: Statistical consistency, convergence rates, and receptive field size
- On Different Facets of Regularization Theory
- scientific article; zbMATH DE number 778115
- On-line RBFNN based identification of rapidly time-varying nonlinear systems with optimal structure-adaptation.
- A heteroscedasticity diagnostic of a regression analysis with copula dependent random variables
- On almost sure convergence and rates of radial bases function networks classifiers
- Sensitivity analysis applied to the construction of radial basis function networks
This page was built for publication: On radial basis function nets and kernel regression: Statistical consistency, convergence rates, and receptive field size
MaRDI item Q1345261