Information criteria in classification: new divergence-based classifiers
From MaRDI portal
Publication:5033469
Cites work
- Scientific article; zbMATH DE number 3173999 (no title available)
- Scientific article; zbMATH DE number 47593 (no title available)
- Scientific article; zbMATH DE number 835699 (no title available)
- Scientific article; zbMATH DE number 3241743 (no title available)
- An introduction to statistical learning. With applications in R
- Bayesian additive machine: classification with a semiparametric discriminant function
- Goodness-of-fit tests via phi-divergences
- Interpreting Kullback-Leibler divergence with the Neyman-Pearson Lemma
- Minimum Cross-Entropy Pattern Classification and Cluster Analysis
- Nearest neighbor pattern classification
- On Information and Sufficiency
- Robust tests based on dual divergence estimators and saddlepoint approximations
- Some Comments on \(C_p\)
- Sparse semiparametric discriminant analysis
- The multivariate skew-normal distribution
Cited in (4)
- A family of the information criteria using the phi-divergence for categorical data
- Divergence-based tests for the bivariate gamma distribution applied to polarimetric synthetic aperture radar
- A consistent information criterion for support vector machines in diverging model spaces
- ARMA process for speckled data
This page was built for publication: Information criteria in classification: new divergence-based classifiers