
Kappa Calculation With 3 Categories

In this video I show you how to calculate Kappa with Studiocode when there are 3 category choices for instances. The main difference is in the matrix of agreement options: the 4-cell matrix you use with 2 choices becomes a 9-cell matrix (see the matrix below):

The 9-cell matrix (one rater's categories down the rows, the other's across the columns):

             Cat 1   Cat 2   Cat 3
    Cat 1      a       b       c
    Cat 2      d       e       f
    Cat 3      g       h       i

The “agreement” cells are now cells a, e, and i.
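
To see the arithmetic behind the scripts, here is a minimal sketch of the 3-category calculation in Python (not Studiocode's own scripting syntax, and the counts in the example are made up for illustration). Observed agreement comes from cells a, e, and i; chance agreement comes from the matching row and column totals:

def kappa_3x3(a, b, c, d, e, f, g, h, i):
    """Cohen's Kappa from a 3x3 agreement matrix.

    Rows are one rater's categories, columns the other's;
    a, e, and i are the agreement (diagonal) cells.
    """
    n = a + b + c + d + e + f + g + h + i
    p_o = (a + e + i) / n                      # observed agreement
    p_e = ((a + b + c) * (a + d + g) +         # chance agreement from the
           (d + e + f) * (b + e + h) +         # matching row and column
           (g + h + i) * (c + f + i)) / n**2   # totals
    return (p_o - p_e) / (1 - p_e)

print(kappa_3x3(20, 5, 2, 3, 15, 4, 1, 2, 18))   # about 0.64 for these made-up counts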

Attached to this post I have included a new PDF document as a reference, as well as a copy of the code window with the scripts already in it. Feel free to copy and adapt these resources to fit your research needs. The same pattern can be used for Kappa calculations with 4 or more categories (see the sketch after the download links). HINT: make the matrix first and identify the agreement cells (the downward diagonal).

New Kappa Code Window:

https://drive.google.com/file/d/0B1mdRDzIaULfY2k5T05SbEo4UWM/edit?usp=sharing

3 Group Kappa Directions:

https://drive.google.com/file/d/0B1mdRDzIaULfZk83NVZpOFBnWms/edit?usp=sharing
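
If you want to check a matrix with 4 or more categories the same way, here is a hedged generalization of the sketch above, again in Python rather than Studiocode scripting: sum the downward diagonal for observed agreement and use the row/column totals for chance agreement. The 4-category counts are invented for illustration:

def kappa(matrix):
    """Cohen's Kappa for any k x k agreement matrix (rows = rater 1)."""
    k = len(matrix)
    n = sum(sum(row) for row in matrix)
    p_o = sum(matrix[j][j] for j in range(k)) / n   # downward-diagonal cells
    row_totals = [sum(row) for row in matrix]
    col_totals = [sum(matrix[r][j] for r in range(k)) for j in range(k)]
    p_e = sum(row_totals[j] * col_totals[j] for j in range(k)) / n**2
    return (p_o - p_e) / (1 - p_e)

print(kappa([[12, 2,  1, 0],
             [ 3, 9,  2, 1],
             [ 0, 2, 10, 1],
             [ 1, 0,  2, 8]]))   # about 0.63 for these made-up counts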


As I mentioned in the video, make sure you are thinking about sample size (the frequency of each code), especially with larger matrices. I would recommend calculating Kappa after you combine timelines from multiple occasions. For infrequent codes, there are better IRR options.


In my last post I explained some scripting for Kappa within Studiocode. In this video I show it in action, describing an efficient way to have raters classify (label) coded instances so that Kappa is calculated automatically. Below are links to download a sample Kappa code window (customize it with the search-and-replace feature) and a PDF of simple Kappa directions.

Kappa Code Window File:

https://drive.google.com/file/d/0B1mdRDzIaULfT21FQXZiOFJpdU0/edit?usp=sharing

Kappa Directions PDF:

https://drive.google.com/file/d/0B1mdRDzIaULfWmpFbk1JZkl6b1E/edit?usp=sharing

Introduction to Cohen’s Kappa for Studiocode

I bet you weren’t expecting a statistics lesson on this blog 😉 In this introductory video I discuss the purpose of Kappa as an interrater reliability (IRR) statistic for categorized instances. I also show you how to write some scripts to calculate it within your code window. The example I give in this video is for instances with only 2 category/classification options. I will show more complex calculations for more than 2 categories later. I will also post a PDF of these directions sometime in the near future so you don’t always have to go back to this video for assistance 🙂
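
For anyone who wants to verify a 2-category result by hand, here is a minimal Python sketch of the same statistic (not the Studiocode script itself; the counts and the a–d cell labeling are mine, for illustration). Kappa is (observed agreement − chance agreement) / (1 − chance agreement), with cells a and d as the agreement cells of the 4-cell matrix:

def kappa_2x2(a, b, c, d):
    """Cohen's Kappa for a 2x2 agreement matrix (rows = rater 1)."""
    n = a + b + c + d
    p_o = (a + d) / n                                       # observed agreement
    p_e = ((a + b) * (a + c) + (c + d) * (b + d)) / n**2    # chance agreement
    return (p_o - p_e) / (1 - p_e)

print(kappa_2x2(25, 5, 4, 16))   # about 0.63 for these made-up counts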