Shannon's Entropy and Its Generalisations Towards Statistical Inference in Last Seven Decades
Publication: 6088260
DOI: 10.1111/insr.12374
OpenAlex: W3014944827
MaRDI QID: Q6088260
Authors: Asok K. Nanda, Shovan Chowdhury
Publication date: 13 December 2023
Published in: International Statistical Review
Full work available at URL: https://doi.org/10.1111/insr.12374
Keywords: estimation; Kullback-Leibler divergence; kernel estimator; goodness-of-fit test; residual entropy; cross entropy
MSC classifications: 62G10 (Nonparametric hypothesis testing); 62-02 (Research exposition (monographs, survey articles) pertaining to statistics); 94A17 (Measures of information, entropy); 62B10 (Statistical aspects of information-theoretic topics)
Cites Work
- A Mathematical Theory of Communication
- Escort distributions minimizing the Kullback-Leibler divergence for a large deviations principle and tests of entropy level
- Applications of entropy in finance: a review
- Entropy differential metric, distance and divergence measures in probability spaces: A unified approach
- A two-sample empirical likelihood ratio test based on samples entropy
- Tests of symmetry based on the sample entropy of order statistics and power comparison
- Estimation of entropy and other functionals of a multivariate density
- Empirical likelihood ratios applied to goodness-of-fit tests based on sample entropy
- Information theory as a unifying statistical approach for use in marketing research
- Descriptive statistics for nonparametric models. I: Introduction
- On the estimation of entropy
- Rényi's entropy as an index of diversity in simple-stage cluster sampling
- The generalized entropy measure to the design and comparison of regression experiment in a Bayesian context
- Information theoretic framework for process control
- Moment-entropy inequalities.
- A test for multivariate normality based on sample entropy and projection pursuit
- Entropy, divergence and distance measures with econometric applications
- A goodness-of-fit test for normality based on the sample entropy of order statistics
- Multiattribute decision making based on Shannon's information entropy, non-linear programming methodology, and interval-valued intuitionistic fuzzy values
- AECID: asymmetric entropy for classifying imbalanced data
- On censored cumulative residual Kullback-Leibler information and goodness-of-fit test with type II censored data
- Determination of entropy measures for the ordinal scale-based linguistic models
- Rényi information measure for a used item
- Residual and past entropy in actuarial science and survival models
- On empirical cumulative residual entropy and a goodness-of-fit test for exponentiality
- A class of measures of informativity of observation channels
- Entropy-based goodness of fit test for a composite hypothesis
- Goodness-of-Fit Test for Exponentiality Based on Kullback–Leibler Information
- Remarks on Some Nonparametric Estimates of a Density Function
- Some tests for the power series distributions in one parameter using the Kullback-Leibler information measure
- Cumulative Residual Entropy: A New Measure of Information
- A Goodness-of-Fit Test for the Gumbel Distribution Based on Kullback–Leibler Information
- Prediction and Entropy of Printed English
- On the convexity of some divergence measures based on entropy functions
- Entropy-Based Tests of Uniformity
- Limit Distributions for a Statistical Estimate of the Entropy
- A nonparametric estimation of the entropy for absolutely continuous distributions (Corresp.)
- Pair Rank Set Sampling
- Kernel estimation of residual entropy
- A new class of random vector entropy estimators and its applications in testing statistical hypotheses
- New partial ordering of survival functions based on the notion of uncertainty
- Bootstrap entropy test for general location-scale time series models with heteroscedasticity
- Censored Kullback-Leibler Information and Goodness-of-Fit Test with Type II Censored Data
- General treatment of goodness-of-fit tests based on Kullback–Leibler information
- Entropy estimation and goodness-of-fit tests for the inverse Gaussian and Laplace distributions using paired ranked set sampling
- An analysis of variance test for normality (complete samples)
- Elements of Information Theory
- On Communication of Analog Data from a Bounded Source Space
- On Estimation of a Probability Density Function and Mode
- Rényi information, loglikelihood and an intrinsic distribution measure