Measuring pairwise interobserver agreement when all subjects are judged by the same observers
Cited in (6)
- Agreement between an isolated rater and a group of raters
- Assessing the reliability of ordered categorical scales using kappa-type statistics
- Statistical description of interrater variability in ordinal ratings
- Agreement between two independent groups of raters
- A new approach to inter-rater agreement through stochastic orderings: the discrete case
- A paired kappa to compare binary ratings across two medical tests