scientific article; zbMATH DE number 877623
From MaRDI portal
zbMATH Open: 0849.68103
MaRDI QID: Q4877337
Authors: Gustavo Deco, Dragan Obradovic
Publication date: 12 May 1996
Title of this publication is not available.
Mathematics Subject Classification:
- Learning and adaptive systems in artificial intelligence (68T05)
- Research exposition (monographs, survey articles) pertaining to computer science (68-02)
Cited In (15)
- Some results on Tsallis entropy measure and \(k\)-record values
- Information-theoretic self-compression of multi-layered neural networks
- Information theoretic learning. Rényi's entropy and kernel perspectives
- Tsallis entropy measure of noise-aided information transmission in a binary channel
- Coherent infomax as a computational goal for neural systems
- On the information bottleneck theory of deep learning
- Maximal Information Divergence from Statistical Models Defined by Neural Networks
- Simplified information maximization for improving generalization performance in multilayered neural networks
- Accelerated information gradient flow
- An information-geometric approach to a theory of pragmatic structuring
- Structure-selection techniques applied to continuous-time nonlinear models
- Entropy and mutual information in models of deep neural networks*
- Quantifying Neurotransmission Reliability Through Metrics-Based Information Analysis
- The neural network as a renormalizer of information
- Title not available