A risk profile for information fusion algorithms (Q400906)

From MaRDI portal
Property / MaRDI profile type: MaRDI publication profile
Property / OpenAlex ID: W3098903841
Property / arXiv ID: 1105.5594
Property / cites work: The geometry of proper scoring rules
Property / cites work: Strictly Proper Scoring Rules, Prediction, and Estimation
Property / cites work: Scoring Rules, Generalized Entropy, and Utility Maximization
Property / cites work: On the generalized entropy pseudoadditivity for complex systems
Property / cites work: Fundamental properties of Tsallis relative entropy
Property / cites work: Law of Error in Tsallis Statistics
Property / cites work: On a \(q\)-central limit theorem consistent with nonextensive statistical mechanics
Property / cites work: Combining Pattern Classifiers
Property / cites work: Integration of Stochastic Models by Minimizing α-Divergence
Property / cites work: Possible generalization of Boltzmann-Gibbs statistics
Property / cites work: Central limit theorem and deformed exponentials
Property / cites work: Superstatistics
Property / cites work: A new one-parameter deformation of the exponential function
Property / cites work: Escort mean values and the characterization of power-law-decaying probability densities
Property / cites work: Q4398828


Language: English
Label: A risk profile for information fusion algorithms
Description: scientific article

    Statements

    A risk profile for information fusion algorithms (English)
    26 August 2014
    Summary: E. T. Jaynes, originator of the maximum entropy interpretation of statistical mechanics, emphasized that there is an inevitable trade-off between the conflicting requirements of robustness and accuracy for any inferencing algorithm. This is because robustness requires discarding information in order to reduce the sensitivity to outliers. The principle of nonlinear statistical coupling, which is an interpretation of the Tsallis entropy generalization, can be used to quantify this trade-off. The coupled surprisal, \(-\ln_\kappa(p)\equiv-\frac{(p^\kappa-1)}{\kappa}\), is a generalization of the Shannon surprisal, or logarithmic scoring rule, given a forecast \(p\) of a true event by an inferencing algorithm. The coupling parameter \(\kappa=1-q\), where \(q\) is the Tsallis entropy index, is the degree of nonlinear coupling between statistical states. Positive (negative) values of nonlinear coupling decrease (increase) the surprisal information metric and thereby bias the risk in favor of decisive (robust) algorithms relative to the Shannon surprisal \((\kappa=0)\). We show that translating the average coupled surprisal to an effective probability is equivalent to using the generalized mean of the true event probabilities as a scoring rule. The metric is used to assess the robustness, accuracy, and decisiveness of a fusion algorithm. We use a two-parameter fusion algorithm to combine input probabilities from \(N\) sources. The generalized mean parameter \(\alpha\) varies the degree of smoothing, and raising the result to a power \(N^\beta\), with \(\beta\) between 0 and 1, provides a model of correlation.
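    The summary describes two computable quantities: the coupled surprisal \(-\ln_\kappa(p)=-\frac{(p^\kappa-1)}{\kappa}\) and its translation back to an effective probability, which the authors state is equivalent to the generalized (power) mean of the true-event probabilities. The following Python sketch (not code from the paper) illustrates that equivalence; the function names are illustrative, and the two-parameter fusion form \((M_\alpha)^{N^\beta}\) shown at the end is only one plausible reading of the abstract's wording, not the paper's stated equation.

        import numpy as np

        def coupled_surprisal(p, kappa):
            # -ln_kappa(p) = -(p**kappa - 1)/kappa; kappa = 0 recovers the Shannon surprisal -ln(p)
            return -np.log(p) if kappa == 0 else -(p**kappa - 1.0) / kappa

        def effective_probability(true_event_probs, kappa):
            # Invert the average coupled surprisal back to a probability.
            avg = np.mean([coupled_surprisal(p, kappa) for p in true_event_probs])
            return float(np.exp(-avg)) if kappa == 0 else float((1.0 - kappa * avg) ** (1.0 / kappa))

        def generalized_mean(probs, alpha):
            # Power mean M_alpha; alpha -> 0 gives the geometric mean.
            probs = np.asarray(probs, dtype=float)
            if alpha == 0:
                return float(np.exp(np.mean(np.log(probs))))
            return float(np.mean(probs**alpha) ** (1.0 / alpha))

        def fuse(source_probs, alpha, beta):
            # Hypothetical two-parameter fusion rule assumed from the abstract wording:
            # generalized mean over N sources, raised to the power N**beta with beta in [0, 1].
            N = len(source_probs)
            return generalized_mean(source_probs, alpha) ** (N**beta)

        forecasts = [0.9, 0.6, 0.05]   # probabilities a forecaster assigned to events that occurred
        for k in (0.5, 0.0, -0.5):     # decisive, neutral (Shannon), robust weightings
            print(k, effective_probability(forecasts, k), generalized_mean(forecasts, k))

        # With alpha -> 0 and beta = 1, the assumed fusion rule reduces to the product of the
        # source probabilities, i.e. fusion under an independence assumption.
        print(fuse(forecasts, alpha=0.0, beta=1.0))

    In this sketch, positive \(\kappa\) pulls the effective probability toward the larger forecasts (decisive), negative \(\kappa\) lets the smallest forecast dominate (robust), and in every case the result matches the generalized mean with exponent \(\kappa\), which is the equivalence stated in the summary.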
    Tsallis entropy
    proper scoring rules
    information fusion
    machine learning
