Measuring Agreement for Multinomial Data
Publication: 3966937
DOI: 10.2307/2529886
zbMath: 0501.62045
MaRDI QID: Q3966937
Publication date: 1982
Published in: Biometrics
Full work available at URL: https://doi.org/10.2307/2529886
Keywords: reliability; multinomial data; complete independence; asymptotic standard error; two-way layout; measure of agreement; intraclass correlation coefficients; kappa-like statistic
62P10: Applications of statistics to biology and medical sciences; meta analysis
62H20: Measures of association (correlation, canonical correlation, etc.)
62P15: Applications of statistics to psychology
Related Items
Confidence intervals for the interrater agreement measure kappa
Statistical description of interrater variability in ordinal ratings
On the equivalence of multirater kappas based on 2-agreement and 3-agreement with binary scores
Cohen's linearly weighted kappa is a weighted average of \(2\times 2\) kappas
Modeling the agreement of discrete bivariate survival times using kappa coefficient
Inequalities between multi-rater kappas
Beyond kappa: A review of interrater agreement measures
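The keywords above mention a kappa-like statistic for agreement on multinomial data in a two-way layout. As a hedged illustration only (not the exact estimator proposed in the paper), the sketch below computes Fleiss' kappa, a standard kappa-type agreement measure for several raters assigning subjects to nominal categories; the function name and example table are invented for this sketch.

```python
import numpy as np

def fleiss_kappa(counts):
    """Fleiss' kappa for an (n_subjects x n_categories) count matrix.

    counts[i, j] is the number of raters who assigned subject i to
    category j; every subject must be rated by the same number of raters.
    """
    counts = np.asarray(counts, dtype=float)
    n, _ = counts.shape
    m = counts.sum(axis=1)[0]            # raters per subject
    p_j = counts.sum(axis=0) / (n * m)   # overall category proportions
    # Observed agreement: fraction of agreeing rater pairs per subject
    P_i = np.sum(counts * (counts - 1), axis=1) / (m * (m - 1))
    P_bar = P_i.mean()
    P_e = np.sum(p_j ** 2)               # chance agreement under independence
    return (P_bar - P_e) / (1 - P_e)

# Illustrative data: 10 subjects, 14 raters, 5 categories
table = [[0, 0, 0, 0, 14], [0, 2, 6, 4, 2], [0, 0, 3, 5, 6],
         [0, 3, 9, 2, 0], [2, 2, 8, 1, 1], [7, 7, 0, 0, 0],
         [3, 2, 6, 3, 0], [2, 5, 3, 2, 2], [6, 5, 2, 1, 0],
         [0, 2, 2, 3, 7]]
print(round(fleiss_kappa(table), 3))
```

Values near 1 indicate strong agreement beyond chance; values near 0 indicate agreement no better than expected under complete independence of the raters.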