Iterative principles of recognition in probabilistic neural networks
DOI: 10.1016/j.neunet.2008.03.002 · zbMATH Open: 1254.68205 · DBLP: journals/nn/GrimH08 · OpenAlex: W2131084184 · Wikidata: Q51887672 · Scholia: Q51887672 · MaRDI QID: Q1932032 · FDO: Q1932032
Publication date: 17 January 2013
Published in: Neural Networks
Full work available at URL: https://doi.org/10.1016/j.neunet.2008.03.002
Keywords: EM algorithm; probabilistic neural networks; distribution mixtures; recognition of numerals; recurrent reasoning
MSC classification: Learning and adaptive systems in artificial intelligence (68T05); Pattern recognition, speech recognition (68T10)
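The keywords above (EM algorithm, distribution mixtures) refer to the standard expectation-maximization iteration for fitting mixture models, which underlies the probabilistic neural networks studied in this paper. A minimal illustrative sketch, assuming a univariate Gaussian mixture (function and variable names are illustrative, not taken from the paper):

```python
import numpy as np

def em_gaussian_mixture(x, k=2, iters=50, seed=0):
    """Fit a k-component univariate Gaussian mixture to data x via EM."""
    rng = np.random.default_rng(seed)
    n = len(x)
    # Initialization: uniform weights, random means, pooled variance.
    w = np.full(k, 1.0 / k)
    mu = rng.choice(x, size=k, replace=False)
    var = np.full(k, np.var(x))
    for _ in range(iters):
        # E-step: posterior responsibility of each component for each point.
        dens = np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        resp = w * dens
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, variances from weighted data.
        nk = resp.sum(axis=0)
        w = nk / n
        mu = (resp * x[:, None]).sum(axis=0) / nk
        var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return w, mu, var

# Usage: two well-separated clusters, which EM should recover.
data = np.concatenate([np.random.default_rng(1).normal(0.0, 1.0, 200),
                       np.random.default_rng(2).normal(5.0, 1.0, 200)])
weights, means, variances = em_gaussian_mixture(data)
```

Each iteration monotonically increases the data log-likelihood; in a probabilistic neural network the fitted component densities play the role of hidden-unit responses.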
Cites Work
- Title not available
- Finite mixture models
- Title not available
- Multiple classifier fusion in probabilistic neural networks
- Title not available
- Bayesian classification by iterated weighting
- An iterative inference mechanism for the probabilistic expert system PES
- Title not available
- About the maximum information and maximum likelihood principles
Cited In (7)
- Parallel Processing and Applied Mathematics
- G-PNN: a genetically engineered probabilistic neural network
- Title not available
- Multiple classifier fusion in probabilistic neural networks
- A simple probabilistic neural network for machine understanding
- Probabilistic neural network with homogeneity testing in recognition of discrete patterns set
- Neuromorphic features of probabilistic neural networks