Cohen's kappa coefficient is a statistic that measures inter-rater (inter-annotator) agreement for categorical items, correcting for the agreement expected by chance.
Twenty patients are examined by two independent doctors. 'Yes' means a doctor diagnosed the patient with disease X; 'No' means the doctor did not.
|                   | Doctor A: Yes | Doctor A: No | Total    |
|-------------------|---------------|--------------|----------|
| **Doctor B: Yes** | 12            | 2            | 14 (70%) |
| **Doctor B: No**  | 4             | 2            | 6 (30%)  |
| **Total**         | 16 (80%)      | 4 (20%)      | 20       |
Substitute the values into the kappa formula, where Pr(a) is the observed agreement and Pr(e) is the agreement expected by chance:

k = ( Pr(a) - Pr(e) ) / ( 1 - Pr(e) )

Pr(a) = ( 12 + 2 ) / 20 = 0.7

Pr(e) is the probability that both doctors say 'Yes' by chance (0.7 × 0.8) plus the probability that both say 'No' by chance (0.3 × 0.2):

Pr(e) = ( 0.7 × 0.8 ) + ( 0.3 × 0.2 ) = 0.62
k = ( 0.7 - 0.62 ) / ( 1 - 0.62 ) = 0.08 / 0.38 ≈ 0.21
Hence Cohen's kappa is approximately 0.21, indicating only fair agreement between the two doctors beyond what chance alone would predict.
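The calculation above can be reproduced in a few lines of Python; this is a minimal sketch that hard-codes the 2×2 table from the example (the variable names are illustrative, not from any library):

```python
# 2x2 contingency table from the worked example:
# rows = Doctor B (Yes, No), columns = Doctor A (Yes, No)
table = [[12, 2],
         [4, 2]]

n = sum(sum(row) for row in table)            # total patients: 20

# Pr(a): observed agreement (both Yes or both No)
p_a = (table[0][0] + table[1][1]) / n         # (12 + 2) / 20 = 0.7

# marginal proportions of 'Yes' for each doctor
b_yes = sum(table[0]) / n                     # Doctor B: 14/20 = 0.7
a_yes = (table[0][0] + table[1][0]) / n       # Doctor A: 16/20 = 0.8

# Pr(e): chance agreement from the marginals
p_e = b_yes * a_yes + (1 - b_yes) * (1 - a_yes)   # 0.56 + 0.06 = 0.62

kappa = (p_a - p_e) / (1 - p_e)
print(round(kappa, 2))  # 0.21
```

The same result can be obtained with `sklearn.metrics.cohen_kappa_score` given the two doctors' per-patient labels, but the direct computation makes the role of the chance-agreement term explicit.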