The role of mutual information in variational classifiers
DOI: 10.1007/s10994-023-06337-6 · zbMATH Open: 1518.68325 · arXiv: 2010.11642 · OpenAlex: W3094116343 · MaRDI QID: Q6134364
Authors: Matías Vera, Leonardo Rey Vega, Pablo Piantanida
Publication date: 22 August 2023
Published in: Machine Learning
Full work available at URL: https://arxiv.org/abs/2010.11642
Recommendations
- Entropy and mutual information in models of deep neural networks
- Emergence of invariance and disentanglement in deep representations
- On inequalities between mutual information and variation
- Learning and generalization with the information bottleneck
Keywords: information theory; generalization error; PAC learning; information bottleneck; cross-entropy loss; variational classifiers
MSC classifications:
- Statistical aspects of information-theoretic topics (62B10)
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Learning and adaptive systems in artificial intelligence (68T05)
- Measures of information, entropy (94A17)
Cites Work
- Title not available
- Joint maximization of accuracy and information for learning the structure of a Bayesian network classifier
- Elements of Information Theory
- Title not available
- DOI: 10.1162/153244302760200704
- Deep learning
- Title not available
- Title not available
- Title not available
- Emergence of invariance and disentanglement in deep representations
- Training Products of Experts by Minimizing Contrastive Divergence
- Asymptotic evaluation of certain Markov process expectations for large time. IV
- A Fast Learning Algorithm for Deep Belief Nets
- Stacked denoising autoencoders: learning useful representations in a deep network with a local denoising criterion
- A learning criterion for stochastic rules
- Robustness and generalization
- Learning and generalization with the information bottleneck
- PAC-Bayesian compression bounds on the prediction error of learning algorithms for classification
- Robust Large Margin Deep Neural Networks
- Foundations of machine learning
- The minimax learning rates of normal and Ising undirected graphical models
- On the information bottleneck theory of deep learning
- Learners that use little information
- How Much Does Your Data Exploration Overfit? Controlling Bias via Information Usage
Cited In (1)