Information criteria in classification: new divergence-based classifiers
Publication: 5033469
DOI: 10.1080/00949655.2020.1798445
OpenAlex: W3045795872
MaRDI QID: Q5033469
FDO: Q5033469
Authors: William D. A. Rodríguez, Getulio J. A. Amaral, Jodavid A. Ferreira, Abraão D. C. Nascimento
Publication date: 23 February 2022
Published in: Journal of Statistical Computation and Simulation
Full work available at URL: https://doi.org/10.1080/00949655.2020.1798445
Cites Work
- An introduction to statistical learning. With applications in R
- Some Comments on Cp
- Nearest neighbor pattern classification
- On Information and Sufficiency
- Sparse semiparametric discriminant analysis
- Goodness-of-fit tests via phi-divergences
- Title not available
- Interpreting Kullback-Leibler divergence with the Neyman-Pearson lemma
- Title not available
- The multivariate skew-normal distribution
- Title not available
- Title not available
- Robust tests based on dual divergence estimators and saddlepoint approximations
- Minimum Cross-Entropy Pattern Classification and Cluster Analysis
- Bayesian additive machine: classification with a semiparametric discriminant function
Cited In (4)
- A consistent information criterion for support vector machines in diverging model spaces
- A family of the information criteria using the phi-divergence for categorical data
- ARMA process for speckled data
- Divergence-based tests for the bivariate gamma distribution applied to polarimetric synthetic aperture radar