- Reliability of Nominal Data Based on Qualitative Judgments - William D. Perreault, Laurence E. Leigh, 1989
- The Equivalence of Weighted Kappa and the Intraclass Correlation Coefficient as Measures of Reliability - Joseph L. Fleiss, Jacob Cohen, 1973
- 28. Kappa measure for Interjudge (dis)agreement for Accessing Relevance in Information Retrieval - YouTube
- Inter-Annotator Agreement (IAA): Pair-wise Cohen kappa and group Fleiss'… - Louis de Bruijn, Towards Data Science
- An Empirical Comparative Assessment of Inter-Rater Agreement of Binary Outcomes and Multiple Raters - Symmetry
- Inter-rater reliability: Cohen kappa, Gwet AC1/AC2, Krippendorff Alpha - K. Gwet's Inter-Rater Reliability Blog, 2014
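The references above all center on chance-corrected agreement between raters. As a minimal sketch of the core idea, the following computes Cohen's kappa for two raters from scratch (the function name, example labels, and use of only the standard library are my own choices, not from any of the sources above):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters labeling the same items (nominal data).

    kappa = (p_o - p_e) / (1 - p_e), where p_o is observed agreement and
    p_e is the agreement expected by chance from each rater's marginals.
    """
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: fraction of items both raters labeled identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement, assuming the raters label independently
    # according to their observed category frequencies.
    counts_a = Counter(rater_a)
    counts_b = Counter(rater_b)
    p_e = sum(counts_a[c] * counts_b.get(c, 0) for c in counts_a) / n**2
    return (p_o - p_e) / (1 - p_e)

# Hypothetical relevance judgments from two annotators.
a = ["yes", "yes", "no", "yes", "no", "no", "yes", "no"]
b = ["yes", "no", "no", "yes", "no", "yes", "yes", "no"]
print(cohens_kappa(a, b))  # 6/8 observed agreement, 0.5 expected -> 0.5
```

Here the raters agree on 6 of 8 items (p_o = 0.75) while chance alone predicts 0.5, giving kappa = 0.5, i.e. "moderate" agreement on the benchmark scales (e.g. Altman's) discussed in Gwet's blog.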