public class CohensKappaInterraterAgreement extends Object
Cohen's Kappa is defined as (PrA - PrE) / (1 - PrE), where PrA is the observed percentage agreement and PrE is the probability of agreement by chance. PrA = agreement / total, and PrE = PrX + PrY, where PrX and PrY are the probabilities of both raters agreeing on category X or on category Y by chance (that is, PrX = Pr(r1, x) * Pr(r2, x), and likewise for each other category).
| Constructor and Description |
|---|
| `CohensKappaInterraterAgreement()` |
| Modifier and Type | Method and Description |
|---|---|
| `static <K,A> double` | `calculate(Map<K,A> rater1, Map<K,A> rater2)` The input should be a `Map` for each rater, where the keys represent all the subjects that were rated by the raters and the values represent the annotations given by the raters. |
public CohensKappaInterraterAgreement()
public static <K,A> double calculate(Map<K,A> rater1, Map<K,A> rater2)
The input should be a Map for each rater, where the keys represent all the subjects that were rated by the raters and the values represent the annotations given by the raters. Agreement between the raters is determined by Object.equals(Object) for the annotation type. Annotations for subjects which are not in both sets are ignored.

Parameters:
rater1 - the annotations from rater 1
rater2 - the annotations from rater 2
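The computation the description implies can be sketched as below. This is a hypothetical standalone re-implementation for illustration, not the library's source: subjects missing from either map are skipped, observed agreement uses equals(), and chance agreement sums the product of each category's marginal probabilities, following the PrA and PrE definitions above.

```java
import java.util.HashMap;
import java.util.Map;

public class KappaSketch {

    // Sketch of Cohen's kappa over two rater maps keyed by subject.
    // Subjects not present in both maps are ignored, as the docs state.
    public static <K, A> double calculate(Map<K, A> rater1, Map<K, A> rater2) {
        Map<A, Integer> counts1 = new HashMap<>(); // per-category counts for rater 1
        Map<A, Integer> counts2 = new HashMap<>(); // per-category counts for rater 2
        int total = 0;
        int agreement = 0;
        for (Map.Entry<K, A> e : rater1.entrySet()) {
            A a2 = rater2.get(e.getKey());
            if (a2 == null) continue; // subject not rated by both raters
            total++;
            if (e.getValue().equals(a2)) agreement++; // agreement via equals()
            counts1.merge(e.getValue(), 1, Integer::sum);
            counts2.merge(a2, 1, Integer::sum);
        }
        double prA = (double) agreement / total;      // observed agreement
        double prE = 0.0;                             // chance agreement
        for (Map.Entry<A, Integer> c : counts1.entrySet()) {
            Integer c2 = counts2.get(c.getKey());
            if (c2 != null) {
                // Pr(r1, category) * Pr(r2, category), summed over categories
                prE += ((double) c.getValue() / total) * ((double) c2 / total);
            }
        }
        return (prA - prE) / (1 - prE);
    }

    public static void main(String[] args) {
        Map<String, String> r1 = new HashMap<>();
        Map<String, String> r2 = new HashMap<>();
        r1.put("s1", "X"); r2.put("s1", "X");
        r1.put("s2", "Y"); r2.put("s2", "X"); // the one disagreement
        r1.put("s3", "X"); r2.put("s3", "X");
        r1.put("s4", "Y"); r2.put("s4", "Y");
        // PrA = 3/4, PrE = (2/4)(3/4) + (2/4)(1/4) = 0.5, kappa = 0.5
        System.out.println(calculate(r1, r2)); // prints 0.5
    }
}
```

Note that kappa is undefined when PrE = 1 (both raters always assign a single category); the real method's behavior in that case is not documented here.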