This hypothetical example shows that the proportion of cases in which two raters assign the same rating to an instrument is inflated by chance agreement. This chance agreement must be removed in order to obtain a valid measure of agreement. Cohen's kappa coefficient is used to assess the degree of agreement beyond chance concordance.

A typical 2×2 contingency table for assessing the agreement of two raters (Example 1).

Step 2: Find the probability that the raters would agree by chance. Rater A said yes to 25/50 images, or 50% (0.50). Rater B said yes to 30/50 images, or 60% (0.60). Therefore, the probability that both raters say "yes" by chance is 0.50 × 0.60 = 0.30, and the probability that both say "no" by chance is 0.50 × 0.40 = 0.20. The total probability of chance agreement is therefore 0.30 + 0.20 = 0.50.
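The chance-agreement calculation above can be sketched in a few lines of Python. The function names and the observed-agreement value used in the usage example are illustrative assumptions, not from the original text; only the marginal proportions (25/50 and 30/50) come from the example.

```python
def chance_agreement(p_a_yes: float, p_b_yes: float) -> float:
    """Probability that two raters agree purely by chance,
    given each rater's marginal probability of saying "yes"."""
    p_both_yes = p_a_yes * p_b_yes              # e.g. 0.50 * 0.60 = 0.30
    p_both_no = (1 - p_a_yes) * (1 - p_b_yes)   # e.g. 0.50 * 0.40 = 0.20
    return p_both_yes + p_both_no


def cohens_kappa(p_observed: float, p_expected: float) -> float:
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    return (p_observed - p_expected) / (1 - p_expected)


# Marginals from the worked example: Rater A 25/50 yes, Rater B 30/50 yes.
pe = chance_agreement(25 / 50, 30 / 50)
print(pe)  # 0.5

# Hypothetical observed agreement of 0.70 (not given in the text),
# shown only to illustrate the kappa formula.
print(cohens_kappa(0.70, pe))
```

With these marginals, half of all agreements would be expected by chance alone, which is why raw percent agreement overstates how well the raters actually concur.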
