
Models of assessing the extent of agreement between raters using kappa coefficients

Joanna Jarosz-Nowak — 2007

Mathematica Applicanda

In medical studies, the quality of assessment is of great importance; typically it is characterized by reliability. The fundamental aim of interobserver reliability is to evaluate the degree of agreement between independent judges examining the same objects. The paper evaluates several interrater reliability measures and their interpretation. Cohen's kappa and Scott's coefficient are considered. We describe models and connections between coefficients defined for dichotomous and polytomous data....
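Both coefficients mentioned in the abstract share the same chance-corrected form, kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed proportion of agreement and p_e the agreement expected by chance; they differ only in how p_e is estimated (Cohen's kappa uses each rater's own marginal distribution, Scott's coefficient uses the pooled marginals). A minimal sketch, assuming two raters' labels are given as equal-length lists (the function names and example ratings are illustrative, not from the paper):

```python
from collections import Counter

def cohen_kappa(r1, r2):
    """Cohen's kappa: chance agreement from each rater's own marginals."""
    n = len(r1)
    # observed proportion of agreement
    po = sum(a == b for a, b in zip(r1, r2)) / n
    # expected agreement: product of the two raters' marginal proportions
    c1, c2 = Counter(r1), Counter(r2)
    pe = sum(c1[k] * c2[k] for k in c1) / (n * n)
    return (po - pe) / (1 - pe)

def scott_pi(r1, r2):
    """Scott's coefficient (pi): chance agreement from pooled marginals."""
    n = len(r1)
    po = sum(a == b for a, b in zip(r1, r2)) / n
    # expected agreement: squared pooled marginal proportions
    pooled = Counter(r1) + Counter(r2)
    pe = sum((v / (2 * n)) ** 2 for v in pooled.values())
    return (po - pe) / (1 - pe)

# Hypothetical dichotomous ratings by two independent judges
rater1 = [1, 1, 0, 1, 0, 1, 1, 0, 1, 0]
rater2 = [1, 0, 0, 1, 0, 1, 1, 1, 1, 1]
print(cohen_kappa(rater1, rater2))  # ~0.348
print(scott_pi(rater1, rater2))     # ~0.341
```

Because the two raters' marginal distributions differ here (6/10 vs 7/10 positives), Cohen's kappa and Scott's coefficient give slightly different values; with identical marginals the two coincide.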
