On the equivalence of multirater kappas based on 2-agreement and 3-agreement with binary scores (Q1952677)

scientific article

    Statements

    On the equivalence of multirater kappas based on 2-agreement and 3-agreement with binary scores (English)
    3 June 2013
    Summary: Cohen's kappa is a popular descriptive statistic for summarizing agreement between the classifications of two raters on a nominal scale. With \(m \geq 3\) raters there are several views in the literature on how to define agreement. The concept of \(g\)-agreement \((g \in \{2, 3, \dots, m\})\) refers to the situation in which it is decided that there is agreement if \(g\) out of \(m\) raters assign an object to the same category. Given \(m \geq 2\) raters we can formulate \(m - 1\) multirater kappas: one based on 2-agreement, one based on 3-agreement, and so on, up to one based on \(m\)-agreement. It is shown that if the scale consists of only two categories, the multirater kappas based on 2-agreement and 3-agreement are identical.
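The equivalence is easy to check numerically. The Python sketch below assumes a Fleiss-type formulation of the \(g\)-agreement kappa, \(\kappa_g = (O_g - E_g)/(1 - E_g)\), where \(O_g\) is the proportion of agreeing \(g\)-tuples of raters averaged over objects and \(E_g = p^g + q^g\) is the chance term built from the overall proportions \(p\) and \(q = 1 - p\) of the two categories; the function name and simulated data are illustrative, not taken from the paper. For binary scores the identities \(1 - E_2 = 2pq\) and \(1 - E_3 = 3pq\) are what drive the result.

```python
from math import comb
import random

def kappa_g(counts, m, g):
    """Fleiss-type g-agreement kappa for binary ratings.

    counts[i] is the number of the m raters who put object i
    in category 1; g is the tuple size required for agreement.
    """
    n = len(counts)
    # Observed: fraction of g-tuples of raters agreeing on an object,
    # averaged over all objects (comb(x, g) == 0 when x < g).
    o_g = sum((comb(x, g) + comb(m - x, g)) / comb(m, g) for x in counts) / n
    # Chance: probability that g independent draws from the overall
    # category distribution (p, 1 - p) all land in the same category.
    p = sum(counts) / (n * m)
    e_g = p ** g + (1 - p) ** g
    return (o_g - e_g) / (1 - e_g)

# Simulated binary ratings: n objects, m raters, arbitrary bias.
random.seed(0)
m, n = 5, 200
counts = [sum(random.random() < 0.6 for _ in range(m)) for _ in range(n)]

k2, k3 = kappa_g(counts, m, 2), kappa_g(counts, m, 3)
print(k2, k3)                    # identical for binary scores
assert abs(k2 - k3) < 1e-12
```

Under this formulation the two values agree to machine precision for any binary data, while kappas based on higher-order agreement (\(g \geq 4\)) need not coincide with them.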