File:Comparison of rubrics for evaluating inter-rater kappa (and intra-class correlation) coefficients.png

Original file (1,919 × 1,061 pixels, file size: 105 KB, MIME type: image/png)

Captions

Comparison of four widely-cited recommendations for labeling levels of inter-rater agreement

Summary

Description
English: Kappa is a way of measuring agreement or reliability that corrects for how often ratings would agree by chance. Like a correlation coefficient, it cannot exceed +1.0 or fall below -1.0. Several authorities have offered "rules of thumb" for interpreting the level of agreement; this figure compares rubrics that are widely used in psychiatry and psychology.
Date:
Source: Own work
Author: Eyoungstrom
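
The description notes that kappa corrects observed agreement for the agreement expected by chance. The sketch below illustrates that correction for Cohen's kappa with two raters; the `cohen_kappa` helper and the rating lists are hypothetical examples for illustration, not data from the figure, and the resulting value would be labeled differently by the rubrics the figure compares.

```python
# Minimal sketch of Cohen's kappa: (p_o - p_e) / (1 - p_e),
# where p_o is observed agreement and p_e is chance agreement.
from collections import Counter

def cohen_kappa(rater_a, rater_b):
    """Two-rater Cohen's kappa for categorical ratings of the same items."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)

    # Observed agreement: proportion of items both raters label identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n

    # Chance agreement: for each category, the product of the two raters'
    # marginal probabilities, summed over all categories.
    counts_a = Counter(rater_a)
    counts_b = Counter(rater_b)
    p_e = sum((counts_a[c] / n) * (counts_b[c] / n)
              for c in set(counts_a) | set(counts_b))

    return (p_o - p_e) / (1 - p_e)

# Hypothetical yes/no ratings by two raters on the same 10 cases.
rater_a = ["yes", "yes", "no", "yes", "no", "no", "yes", "no", "yes", "no"]
rater_b = ["yes", "no",  "no", "yes", "no", "no", "yes", "no", "yes", "yes"]

print(cohen_kappa(rater_a, rater_b))  # 0.6: raw agreement is 0.8, chance agreement 0.5
```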

Licensing

I, the copyright holder of this work, hereby publish it under the following license:
This file is licensed under the Creative Commons Attribution-Share Alike 4.0 International license.
You are free:
  • to share – to copy, distribute and transmit the work
  • to remix – to adapt the work
Under the following conditions:
  • attribution – You must give appropriate credit, provide a link to the license, and indicate if changes were made. You may do so in any reasonable manner, but not in any way that suggests the licensor endorses you or your use.
  • share alike – If you remix, transform, or build upon the material, you must distribute your contributions under the same or compatible license as the original.

File history

Click on a date/time to view the file as it appeared at that time.

Date/Time: 13:24, 12 December 2019 (current version)
Dimensions: 1,919 × 1,061 (105 KB)
User: Eyoungstrom (talk | contribs)
Comment: User created page with UploadWizard

File usage

There are no pages that use this file.

File usage on other wikis


Metadata