A risk profile for information fusion algorithms
DOI: 10.3390/E13081518
zbMATH Open: 1301.94048
arXiv: 1105.5594
OpenAlex: W3098903841
MaRDI QID: Q400906
Herbert Landau, Kenric P. Nelson, Brian J. Scannell
Publication date: 26 August 2014
Published in: Entropy
Abstract: E.T. Jaynes, originator of the maximum entropy interpretation of statistical mechanics, emphasized that there is an inevitable trade-off between the conflicting requirements of robustness and accuracy for any inference algorithm, because robustness requires discarding information in order to reduce sensitivity to outliers. The principle of nonlinear statistical coupling, an interpretation of the Tsallis entropy generalization, can be used to quantify this trade-off. The coupled-surprisal, \(-\ln_k(p) = -(p^k - 1)/k\), generalizes the Shannon surprisal, or logarithmic scoring rule, given a forecast \(p\) of a true event by an inference algorithm. The coupling parameter \(k = 1 - q\), where \(q\) is the Tsallis entropy index, is the degree of nonlinear coupling between statistical states. Positive (negative) values of nonlinear coupling decrease (increase) the surprisal information metric and thereby bias the risk in favor of decisive (robust) algorithms relative to the Shannon surprisal (\(k = 0\)). We show that translating the average coupled-surprisal to an effective probability is equivalent to using the generalized mean of the true-event probabilities as a scoring rule. The metric is used to assess the robustness, accuracy, and decisiveness of a fusion algorithm. We use a two-parameter fusion algorithm to combine input probabilities from \(N\) sources: the generalized-mean parameter \(\alpha\) varies the degree of smoothing, and raising the result to the power \(N^\beta\), with \(\beta\) between 0 and 1, provides a model of correlation between the sources.
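The coupled-surprisal and its inversion to an effective probability can be illustrated directly. Below is a minimal Python sketch, using hypothetical forecast values, showing the equivalence claimed in the abstract: the effective probability is the generalized (power) mean of order \(k\) of the true-event probabilities, with \(k = 0\) recovering the geometric mean of the Shannon case.

```python
import numpy as np

def coupled_surprisal(p, k):
    """Coupled-surprisal -ln_k(p) = -(p**k - 1)/k; k = 0 is the Shannon surprisal -ln(p)."""
    p = np.asarray(p, dtype=float)
    if k == 0:
        return -np.log(p)
    return -(p**k - 1) / k

def effective_probability(probs, k):
    """Invert the average coupled-surprisal back to a probability.

    Equivalent to the power mean of order k of the true-event
    probabilities; k = 0 gives the geometric mean.
    """
    s_avg = np.mean(coupled_surprisal(probs, k))
    if k == 0:
        return np.exp(-s_avg)
    return (1 - k * s_avg) ** (1 / k)

# Hypothetical forecasts of the true event from four test samples.
probs = [0.9, 0.7, 0.2, 0.8]
for k in (-0.5, 0.0, 0.5):  # robust, neutral (Shannon), decisive weighting
    print(f"k = {k:+.1f}: effective probability = {effective_probability(probs, k):.4f}")
```

Negative \(k\) pulls the effective probability toward the worst forecast (penalizing the outlier at 0.2), while positive \(k\) pulls it toward the better ones, matching the robust/decisive bias described in the abstract.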
Full work available at URL: https://arxiv.org/abs/1105.5594
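The abstract specifies the two-parameter fusion rule only at a high level. The sketch below assumes one plausible form: the per-class power mean of order \(\alpha\) across the \(N\) sources, raised to \(N^\beta\) and renormalized over classes, so that \(\alpha = 0, \beta = 1\) recovers the independent product rule and \(\beta = 0\) the fully correlated mean. Function names and data are illustrative, not taken from the paper.

```python
import numpy as np

def generalized_mean(p, alpha):
    """Power mean of order alpha; alpha = 0 is the geometric mean."""
    p = np.asarray(p, dtype=float)
    if alpha == 0:
        return np.exp(np.mean(np.log(p)))
    return np.mean(p**alpha) ** (1 / alpha)

def fuse(source_probs, alpha, beta):
    """Assumed two-parameter fusion of per-class probabilities from N sources.

    The per-class power mean of order alpha is raised to N**beta and
    renormalized. alpha = 0 with beta = 1 reduces to the product rule
    (independent sources); beta = 0 keeps the mean (fully correlated sources).
    """
    source_probs = np.asarray(source_probs, dtype=float)  # shape (N, n_classes)
    n = source_probs.shape[0]
    fused = np.array([generalized_mean(source_probs[:, c], alpha)
                      for c in range(source_probs.shape[1])]) ** (n**beta)
    return fused / fused.sum()

# Three sources reporting probabilities over two classes (hypothetical data).
sources = [[0.8, 0.2], [0.6, 0.4], [0.7, 0.3]]
print(fuse(sources, alpha=0.0, beta=1.0))  # product rule: treats sources as independent
print(fuse(sources, alpha=0.0, beta=0.0))  # geometric mean: fully correlated sources
```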
Classification:
- Learning and adaptive systems in artificial intelligence (68T05)
- Measures of information, entropy (94A17)
- Foundations of equilibrium statistical mechanics (82B03)
Cites Work
- Title not available
- Strictly Proper Scoring Rules, Prediction, and Estimation
- Combining Pattern Classifiers
- Possible generalization of Boltzmann-Gibbs statistics
- On a \(q\)-central limit theorem consistent with nonextensive statistical mechanics
- Fundamental properties of Tsallis relative entropy
- Law of Error in Tsallis Statistics
- Integration of Stochastic Models by Minimizing α-Divergence
- Superstatistics
- A new one-parameter deformation of the exponential function
- Escort mean values and the characterization of power-law-decaying probability densities
- Scoring Rules, Generalized Entropy, and Utility Maximization
- On the generalized entropy pseudoadditivity for complex systems
- Central limit theorem and deformed exponentials
- The geometry of proper scoring rules
Cited In (3)