A risk profile for information fusion algorithms (Q400906)

Language: English
Label: A risk profile for information fusion algorithms
Description: scientific article

    Statements

    Title: A risk profile for information fusion algorithms (English)
    Publication date: 26 August 2014
    Summary: E. T. Jaynes, originator of the maximum entropy interpretation of statistical mechanics, emphasized that there is an inevitable trade-off between the conflicting requirements of robustness and accuracy for any inferencing algorithm, because robustness requires discarding information in order to reduce sensitivity to outliers. The principle of nonlinear statistical coupling, an interpretation of the Tsallis entropy generalization, can be used to quantify this trade-off. The coupled surprisal, \(-\ln_\kappa(p)\equiv-\frac{p^\kappa-1}{\kappa}\), generalizes the Shannon surprisal, or logarithmic scoring rule, given a forecast \(p\) of a true event by an inferencing algorithm. The coupling parameter \(\kappa=1-q\), where \(q\) is the Tsallis entropy index, is the degree of nonlinear coupling between statistical states. Positive (negative) values of nonlinear coupling decrease (increase) the surprisal information metric and thereby bias the risk in favor of decisive (robust) algorithms relative to the Shannon surprisal \((\kappa=0)\). We show that translating the average coupled surprisal to an effective probability is equivalent to using the generalized mean of the true event probabilities as a scoring rule. The metric is used to assess the robustness, accuracy, and decisiveness of a fusion algorithm. We use a two-parameter fusion algorithm to combine input probabilities from \(N\) sources: the generalized mean parameter \(\alpha\) varies the degree of smoothing, and raising to a power \(N^\beta\), with \(\beta\) between 0 and 1, provides a model of correlation between the sources.
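    The following is a minimal Python sketch (assuming NumPy; the function names and the closing check are illustrative, not taken from the paper) of the quantities the summary defines: the coupled surprisal, its translation into an effective probability, and the two-parameter fusion rule.

        import numpy as np

        def coupled_surprisal(p, kappa):
            # Coupled surprisal -ln_kappa(p) = -(p**kappa - 1)/kappa;
            # reduces to the Shannon surprisal -ln(p) as kappa -> 0.
            if kappa == 0.0:
                return -np.log(p)
            return -(p**kappa - 1.0) / kappa

        def generalized_mean(p, alpha):
            # Power mean of probabilities p with exponent alpha
            # (alpha = 1: arithmetic mean; alpha = 0: geometric mean).
            p = np.asarray(p, dtype=float)
            if alpha == 0.0:
                return np.exp(np.mean(np.log(p)))
            return np.mean(p**alpha) ** (1.0 / alpha)

        def effective_probability(p_true, kappa):
            # Invert the coupled surprisal at the average surprisal:
            # p_eff = (1 - kappa * s_bar)**(1/kappa). Algebraically this
            # equals generalized_mean(p_true, kappa), which is the
            # equivalence the summary states.
            p = np.asarray(p_true, dtype=float)
            s_bar = np.mean(coupled_surprisal(p, kappa))
            if kappa == 0.0:
                return np.exp(-s_bar)
            return (1.0 - kappa * s_bar) ** (1.0 / kappa)

        def fuse(p_sources, alpha, beta):
            # Two-parameter fusion of N source probabilities: smooth with
            # the generalized mean (alpha), then raise to the power N**beta.
            # beta in [0, 1] models correlation: beta = 0 leaves the mean
            # unchanged (fully correlated sources), while alpha = 0 with
            # beta = 1 yields the product of the inputs (independence).
            n = len(p_sources)
            return generalized_mean(p_sources, alpha) ** (n ** beta)

    As a check, effective_probability(p, kappa) should agree with generalized_mean(p, kappa) up to rounding, and fuse(p, 0.0, 1.0) reduces to the product of the source probabilities, the independent-evidence case.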
    Keywords: Tsallis entropy; proper scoring rules; information fusion; machine learning