Estimating Inter-Rater Reliability with Cohen's Kappa in SPSS

This video demonstrates how to estimate inter-rater reliability with Cohen’s Kappa in SPSS. Calculating sensitivity and specificity is reviewed.
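For reference, the kappa estimate demonstrated in the video can be reproduced with SPSS syntax along the following lines. This is a minimal sketch: the variable names rater1 and rater2 are hypothetical stand-ins for the two raters' dichotomous codes (e.g., 0 = condition absent, 1 = condition present).

* Hypothetical example: two raters' dichotomous codes stored in rater1 and rater2.
CROSSTABS
  /TABLES=rater1 BY rater2
  /STATISTICS=KAPPA
  /CELLS=COUNT.

The same 2x2 crosstabulation can then be read for sensitivity and specificity: treating one rater (or a gold standard) as the reference, sensitivity is true positives divided by all actual positives, and specificity is true negatives divided by all actual negatives.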
Comments

Many thanks for your clear explanation of the Kappa calculation and interpretation, Prof!

ywulxqg

Fantastic explanation. Another string to my statistical bow. Thank you!

LiamDarbyshire

Thanks for your videos, Dr. Grande, always super helpful!

danwilson

Again, I like this better in SPSS than in Excel. This info helped somewhat with specificity and sensitivity, but I may have to cross-reference with the book.

greggelliott

Thanks for your video. I have a question:
What does this error indicate? "Kappa statistic cannot be computed. It requires a two-way table in which the variables are of the same type."
Even though I used a two-way table in which the variables are of the same type (both nominal), I still get this error from SPSS.
What should I do about it?

Radiology_Specific

Thank you, Dr. Grande, for this. I wonder what the differences are between Cohen's kappa and Cronbach's alpha. I know Cohen's kappa is good for dichotomous data, whereas Cronbach's alpha can be used for continuous data, such as Likert-scale ratings?

bobbie

Will definitely require more study to fully understand when to apply this technique and how to interpret the results. A helpful introduction, however.

MarkVanderley

Hey, I have a question for ya, Todd! Is there a way to estimate inter-rater reliability with Krippendorff's alpha (kalpha) in SPSS?

Thanks!

kristinrichie

Dr. Grande: what would you do if kappa is too low? Would you ask both coders to recode the data once again, but this time together, going through every single item so that they agree with each other and the kappa increases? What do you suggest? Thanks in advance.

MrJsanabria

Many thanks for making the video, Dr. Grande! I followed the steps in SPSS and got a negative number (-.143) for Cohen's Kappa. Just wondering what a negative number means?

CH-wncr

Is it possible to use Cohen's Kappa with more than two raters?

sev