On information and distance measures, error bounds, and feature selection
From MaRDI portal
Publication: 1228464
DOI: 10.1016/S0020-0255(76)90746-5 · zbMath: 0333.94007 · OpenAlex: W2006087861 · MaRDI QID: Q1228464
Publication date: 1976
Published in: Information Sciences
Full work available at URL: https://doi.org/10.1016/s0020-0255(76)90746-5
Mathematics Subject Classification:
- Research exposition (monographs, survey articles) pertaining to information and communication theory (94-02)
- Information theory (general) (94A15)
Related Items (12)
- Information of degree \(\beta\) and probability of error
- Generalized divergence measures and the probability of error
- A discriminant analysis using composite features for classification problems
- Some aspects of error bounds in feature selection
- A combined algorithm for weighting the variables and clustering in the clustering problem
- A class of lower bounds on the Bayesian probability of error
- Mahalanobis distance-based two new feature evaluation criteria
- Toward a tight upper bound for the error probability of the binary Gaussian classification problem
- Bootstrap methods for the empirical study of decision-making and information flows in social systems
- A generalized class of certainty and information measures
- Trigonometric entropies, Jensen difference divergence measures, and error bounds
- The φ-Entropy in the Selection of a Fixed Number of Experiments
Cites Work
- On a class of computationally efficient feature selection criteria
- A class of measures of informativity of observation channels
- Patterns in pattern recognition: 1968-1974
- On a New Class of Bounds on Bayes Risk in Multihypothesis Pattern Recognition
- Note on discrimination information and variation (Corresp.)
- Probability of Error Bounds