Learning and Convergence of the Normalized Radial Basis Functions Networks
Publication: 5881515
DOI: 10.1007/978-3-319-91253-0_12
OpenAlex: W2802719061
MaRDI QID: Q5881515
Marian A. Partyka, Adam Krzyżak
Publication date: 10 March 2023
Published in: Artificial Intelligence and Soft Computing
Full work available at URL: https://doi.org/10.1007/978-3-319-91253-0_12
Keywords: classification; strong convergence; rates of convergence; nonlinear regression; MISE convergence; Wolverton-Wagner recursive radial basis function networks
MSC classification:
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- General nonlinear regression (62J02)
- Learning and adaptive systems in artificial intelligence (68T05)
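For orientation: the normalized radial basis function (RBF) network studied in this literature is commonly written in the following standard textbook form (a generic formulation, not a formula quoted from the paper itself; the paper's own notation may differ). Here \(c_1, \dots, c_N\) are centers, \(w_1, \dots, w_N\) are weights, \(K\) is a radial kernel, and \(h_N > 0\) is a bandwidth (receptive-field) parameter:
\[
  f_N(x) \;=\; \frac{\sum_{i=1}^{N} w_i \, K\!\left(\frac{\|x - c_i\|}{h_N}\right)}
                    {\sum_{j=1}^{N} K\!\left(\frac{\|x - c_j\|}{h_N}\right)},
  \qquad \text{e.g. } K(t) = e^{-t^2}.
\]
With centers placed at the data points \(X_i\) and weights \(w_i = Y_i\), this reduces to the Nadaraya-Watson kernel regression estimate, which is why kernel-regression consistency results (several of which are cited below) carry over to normalized RBF networks. MISE convergence in the keywords refers to convergence of the mean integrated squared error, \(\mathbb{E}\int |f_N(x) - f(x)|^2 \,\mu(dx) \to 0\).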
Cites Work
- Lectures on the nearest neighbor method
- Fourier and Hermite series estimates of regression functions
- An equivalence theorem for \(L_1\) convergence of the kernel regression estimate
- On the asymptotic normality of the \(L_2\)-error in partitioning regression estimation
- On radial basis function nets and kernel regression: Statistical consistency, convergence rates, and receptive field size
- Multilayer feedforward networks are universal approximators
- On the strong universal consistency of nearest neighbor regression function estimates
- A distribution-free theory of nonparametric regression
- Convergence and rates of convergence of radial basis functions networks in function learning
- Consistency of random forests
- Weighted sums of certain dependent random variables
- Distribution-free consistency of a nonparametric kernel regression estimate and classification
- Nonparametric Regression Estimation by Normalized Radial Basis Function Networks
- Necessary and sufficient conditions for Bayes risk consistency of a recursive kernel classification rule (Corresp.)
- Any Discrimination Rule Can Have an Arbitrarily Bad Probability of Error for Finite Sample Size
- Global convergence of the recursive kernel regression estimates with applications in classification and nonlinear system estimation
- On the \(L_1\) convergence of kernel estimators of regression functions with applications in discrimination
- On the interpolation error of functions of several variables by \(D^m\)-splines
- Universal approximation bounds for superpositions of a sigmoidal function
- Strong universal consistency of neural network classifiers
- The rates of convergence of kernel regression estimates and classification rules
- Nonparametric estimation via empirical risk minimization
- Neural Network Learning
- Convergence and Rates of Convergence of Recursive Radial Basis Functions Networks in Function Learning and Classification
- Nonparametric Regression Based on Hierarchical Interaction Models
- Estimation of Dependences Based on Empirical Data
- Asymptotically optimal discriminant functions for pattern classification
- On the Uniform Convergence of Relative Frequencies of Events to Their Probabilities
- Approximation by superpositions of a sigmoidal function
- Random forests