On radial basis function nets and kernel regression: Statistical consistency, convergence rates, and receptive field size
DOI: 10.1016/0893-6080(94)90040-X · zbMATH Open: 0817.62031 · OpenAlex: W2051688774 · MaRDI QID: Q1345261 · FDO: Q1345261
Authors: Adam Krzyżak, Alan Yuille, Lei Xu
Publication date: 2 July 1995
Published in: Neural Networks
Full work available at URL: https://doi.org/10.1016/0893-6080(94)90040-x
Recommendations
- Convergence and rates of convergence of recursive radial basis functions networks in function learning and classification
- Learning and Convergence of the Normalized Radial Basis Functions Networks
Keywords: consistency; least squares estimator; upper bounds; universal approximation; kernel regression estimators; RBF nets; best consistent estimator; convergence rates of the approximation error; Parzen window estimator; radial basis function nets; receptive field size
MSC: Density estimation (62G07); Asymptotic properties of nonparametric inference (62G20); Learning and adaptive systems in artificial intelligence (68T05)
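For orientation, here is a minimal sketch of the two estimators the keywords refer to, in standard textbook notation (not taken from the paper itself). An RBF net with kernel \(K\), centers \(c_i\), weights \(w_i\), and receptive field size (width) \(\sigma\) computes
\[ f_n(x) = \sum_{i=1}^{n} w_i \, K\!\left(\frac{\|x - c_i\|}{\sigma}\right), \]
while the kernel (Nadaraya-Watson) regression estimate from data \((X_1, Y_1), \dots, (X_n, Y_n)\) with bandwidth \(h_n\) is
\[ m_n(x) = \frac{\sum_{i=1}^{n} Y_i \, K\!\left((x - X_i)/h_n\right)}{\sum_{i=1}^{n} K\!\left((x - X_i)/h_n\right)}. \]
A normalized RBF net with centers placed at the data points takes the same form, which is the connection the title draws between the two.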
Cites Work
- Regularization algorithms for learning that are equivalent to multilayer networks
- Title not available
- Probability Inequalities for Sums of Bounded Random Variables
- Title not available
- Title not available
- On the almost everywhere convergence of nonparametric regression function estimates
- Probability Inequalities for the Sum of Independent Random Variables
- Multilayer feedforward networks are universal approximators
- Title not available
- An equivalence theorem for \(L_1\) convergence of the kernel regression estimate
- Title not available
- On radial basis function nets and kernel regression: Statistical consistency, convergence rates, and receptive field size
- The rates of convergence of kernel regression estimates and classification rules
- Networks and the best approximation property
- Distribution-free pointwise consistency of kernel regression estimate
- The pointwise rate of convergence of the kernel regression estimate
- Title not available
- On exponential bounds on the Bayes risk of the kernel classification rule
Cited In (13)
- Pattern recognition with ordered labels
- Title not available
- Flexible regression modeling
- Random Projection RBF Nets for Multidimensional Density Estimation
- On radial basis function nets and kernel regression: Statistical consistency, convergence rates, and receptive field size
- On Learning and Convergence of RBF Networks in Regression Estimation and Classification
- Kernel Based Learning Methods: Regularization Networks and RBF Networks
- On-line RBFNN based identification of rapidly time-varying nonlinear systems with optimal structure-adaptation.
- On almost sure convergence and rates of radial bases function networks classifiers
- On Different Facets of Regularization Theory
- A heteroscedasticity diagnostic of a regression analysis with copula dependent random variables
- Learning and Convergence of the Normalized Radial Basis Functions Networks
- Sensitivity analysis applied to the construction of radial basis function networks