Pages that link to "Item:Q3966937"
From MaRDI portal
The following pages link to Measuring Agreement for Multinomial Data (Q3966937):
Displayed 16 items.
- Sklar's Omega: A Gaussian Copula-Based Framework for Assessing Agreement (Q65362)
- Corrected Zegers-ten Berge coefficients are special cases of Cohen's weighted kappa (Q288981)
- Bayesian testing of agreement criteria under order constraints (Q508108)
- On the equivalence of multirater kappas based on 2-agreement and 3-agreement with binary scores (Q1952677)
- A comparison of reliability coefficients for ordinal rating scales (Q2075724)
- Cohen's linearly weighted kappa is a weighted average of \(2\times 2\) kappas (Q2275430)
- A family of multi-rater kappas that can always be increased and decreased by combining categories (Q2360886)
- Equivalences of weighted kappas for multiple raters (Q2360892)
- Modeling the agreement of discrete bivariate survival times using kappa coefficient (Q2432620)
- Inequalities between multi-rater kappas (Q2442790)
- Assessing agreement with multiple raters on correlated kappa statistics (Q3188707)
- Beyond kappa: A review of interrater agreement measures (Q4262085)
- Confidence intervals for the interrater agreement measure kappa (Q4721489)
- Weighted kappa as a function of unweighted kappas (Q4976567)
- Statistical inference of agreement coefficient between two raters with binary outcomes (Q5077201)
- Statistical description of interrater variability in ordinal ratings (Q5424028)