Information indices: Unification and applications.
From MaRDI portal
Publication: 1858923
DOI: 10.1016/S0304-4076(01)00111-7 · zbMath: 1051.62006 · MaRDI QID: Q1858923
Ehsan S. Soofi, Joseph J. Retzer
Publication date: 17 February 2003
Published in: Journal of Econometrics
Applications of statistics to economics (62P20) · Statistical aspects of information-theoretic topics (62B10)
Related Items
- On local divergences between two probability measures
- On the Choice of Nonparametric Entropy Estimator in Entropy-Based Goodness-of-Fit Test Statistics
- Information measures for generalized gamma family
- Asymmetric Laplace regression: maximum likelihood, maximum entropy and quantile regression
- An Estimator of Shannon Entropy of Beta-Generated Distributions and a Goodness-of-Fit Test
- Ranking Forecasts by Stochastic Error Distance
- Information and Reliability Measures
- Bayes Estimate and Inference for Entropy and Information Index of Fit
- Censored Kullback-Leibler Information and Goodness-of-Fit Test with Type II Censored Data
- On association in regression: the coefficient of determination revisited
- Predictor relative importance and matching regression parameters
- A microeconomic interpretation of the maximum entropy estimator of multinomial logit models and its equivalence to the maximum likelihood estimator
- An efficient correction to the density-based empirical likelihood ratio goodness-of-fit test for the inverse Gaussian distribution
- Implications of quantal response statistical equilibrium
- Information importance of predictors: concept, measures, Bayesian inference, and applications
- Executives' perceived environmental uncertainty shortly after 9/11
- On the General Class of Two-Sided Power Distribution
- Two-Sided Generalized Exponential Distribution
- Information measures of Dirichlet distribution with applications
- Information and entropy econometrics -- editor's view
- Comparison of maximum entropy and higher-order entropy estimators
Cites Work
- A Mathematical Theory of Communication
- Effects of collinearity on information about regression coefficients
- On entropy-based goodness-of-fit tests
- How many bits of information does an independent variable yield in a multiple regression?
- Information theory as a unifying statistical approach for use in marketing research
- Indicator and filter attributes of monetary aggregates
- I-divergence geometry of probability distributions and minimization problems
- Expected information as expected utility
- On the estimation of entropy
- An \(R\)-squared measure of goodness of fit for some common nonlinear regression models
- Computation of maximum entropy Dirichlet for modeling lifetime data
- Entropy, divergence and distance measures with econometric applications
- Prediction Variance and Information Worth of Observations in Time Series
- Diagnostic Measures for Model Criticism
- A Maximum Entropy Approach to Recovering Information From Multinomial Response Data
- Additivity of Information in Exponential Family Probability Laws
- On a Measure of the Information Provided by an Experiment
- ON THE ANALYSIS OF MULTIPLE REGRESSION IN k CATEGORIES
- Information Theory and Statistical Mechanics
- A Closer Look at the Deviance
- Relative Entropy Measures of Multivariate Dependence
- Entropy-Based Tests of Uniformity
- USING THE MUTUAL INFORMATION COEFFICIENT TO IDENTIFY LAGS IN NONLINEAR MODELS
- Capturing the Intangible Concept of Information
- Principal Information Theoretic Approaches
- A compendium to information theory in economics and econometrics
- A Predictive View of the Detection and Characterization of Influential Observations in Regression Analysis
- Information Distinguishability with Application to Analysis of Failure Data
- Distinguishability of Sets of Distributions
- Prior Probabilities
- Marginal Homogeneity of Multidimensional Contingency Tables
- Consistent Nonparametric Entropy-Based Testing
- On Information and Sufficiency
- Certain Inequalities in Information Theory and the Cramer-Rao Inequality
- An invariant form for the prior probability in estimation problems