Efficient approximation of the conditional relative entropy with applications to discriminative learning of Bayesian network classifiers
DOI: 10.3390/E15072716 · zbMATH Open: 1398.94071 · OpenAlex: W2065733810 · Wikidata: Q59196638 · Scholia: Q59196638 · MaRDI QID: Q280459 · FDO: Q280459
Paulo Mateus, Alexandra M. Carvalho, P. Adão
Publication date: 10 May 2016
Published in: Entropy
Full work available at URL: https://doi.org/10.3390/e15072716
Recommendations
- Discriminative learning of Bayesian networks via factorized conditional log-likelihood
- On discriminative Bayesian network classifiers and logistic regression
- Efficient parameter learning of Bayesian network classifiers
- Title not available (zbMATH DE number 1927373)
- Structural extension to logistic regression: Discriminative parameter learning of belief net classifiers
Classification:
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Learning and adaptive systems in artificial intelligence (68T05)
- Measures of information, entropy (94A17)
Cites Work
- Title not available
- Title not available
- 10.1162/153244302760200696
- Approximating discrete probability distributions with dependence trees
- Probabilistic graphical models.
- Bayesian network classifiers
- Optimum branchings
- On the optimality of the simple Bayesian classifier under zero-one loss
- Title not available
- Title not available
- Approximating probability distributions to reduce storage requirements
- Title not available
- Approximating probabilistic inference in Bayesian belief networks is NP-hard
Cited In (1)