Kappa - SPSS (part 1)

I demonstrate how to perform and interpret a Kappa analysis (a.k.a. Cohen's Kappa) in SPSS. I also demonstrate the usefulness of Kappa in contrast to the more intuitive approach of simply calculating the percentage of agreement between two raters.
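For anyone who wants to verify the two statistics against SPSS's output, here is a minimal Python sketch of both (the video itself works entirely in SPSS; the rater data below are made up for illustration):

```python
from collections import Counter

def percent_agreement(rater1, rater2):
    # Share of cases on which the two raters gave the same rating.
    matches = sum(a == b for a, b in zip(rater1, rater2))
    return matches / len(rater1)

def cohens_kappa(rater1, rater2):
    # Cohen's kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed
    # agreement and p_e is the agreement expected by chance, computed
    # from each rater's marginal rating frequencies.
    n = len(rater1)
    p_o = percent_agreement(rater1, rater2)
    freq1, freq2 = Counter(rater1), Counter(rater2)
    categories = set(rater1) | set(rater2)
    p_e = sum((freq1[c] / n) * (freq2[c] / n) for c in categories)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical ratings (1 = condition present, 0 = absent) from two raters.
rater1 = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
rater2 = [1, 0, 0, 1, 0, 1, 1, 1, 1, 1]

print(f"Percent agreement: {percent_agreement(rater1, rater2):.2f}")  # 0.80
print(f"Cohen's kappa:     {cohens_kappa(rater1, rater2):.2f}")       # 0.52
```

Here the raters agree on 80% of cases, yet kappa is only about 0.52, because much of that agreement would be expected by chance alone; this is exactly the contrast the video draws.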
Comments

Hi there, do you know of a technique to generate a statistic for agreement between 2 raters with multiple responses per case? E.g., each participant is rated as having between 1 and 5 diseases. Many thanks for your videos. They have helped me a lot!

kellievella

Hello,

If we have 6 columns, where column 3 is the sum of columns 1 and 2, and column 6 is the sum of columns 4 and 5, the average kappa value of columns 1 & 2 and 3 & 4 is different from the kappa value of columns 3 and 6.

Which one is correct?
Thanks

atakltiadhanom

Is it possible to use it if there are more than two raters?

mgme

Hi, I know it's been a long time, but could we have access to download the raw data for this session? I would really love to practise with it.

gbengamogaji

I have a question. I am trying to find percentage agreement for the GAS assessment. It has 12 questions based on a 5-point scale (-2 to 2). The assessors have all assessed the same case example, but there are 40 assessors. How would I input my data, and what assessment should I use?

BGRED

What is the difference between Cohen's Kappa and Fleiss' Kappa? Can somebody tell me?

Eva_Abdillah
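An editorial note on the question above: Cohen's kappa compares exactly two raters, whereas Fleiss' kappa generalizes the same chance-corrected-agreement idea to any fixed number of raters per subject. The formula is short; here is a minimal Python sketch with made-up counts:

```python
def fleiss_kappa(counts):
    # counts[i][j] = number of raters who put subject i in category j;
    # every subject must be rated by the same number of raters.
    N = len(counts)                     # number of subjects
    n = sum(counts[0])                  # raters per subject
    k = len(counts[0])                  # number of categories
    # p_j: overall proportion of all ratings that fall in category j.
    p = [sum(row[j] for row in counts) / (N * n) for j in range(k)]
    # P_i: observed agreement among the raters on subject i.
    P = [(sum(c * c for c in row) - n) / (n * (n - 1)) for row in counts]
    P_bar = sum(P) / N                  # mean observed agreement
    P_e = sum(pj * pj for pj in p)      # agreement expected by chance
    return (P_bar - P_e) / (1 - P_e)

# Hypothetical data: 4 subjects, each rated by 3 raters into 3 categories.
counts = [
    [3, 0, 0],   # all three raters chose category 1
    [0, 3, 0],
    [1, 2, 0],
    [0, 1, 2],
]
print(f"Fleiss' kappa: {fleiss_kappa(counts):.2f}")  # 0.45
```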

If one of the scorers has given the same score for every patient, is there a way to get SPSS to still carry out the kappa test? (It is telling me it cannot calculate the statistic because one of the values is a constant.)
Also, if you have 4 possible scores (e.g. 0-3) but none of the patients received a score of 3, will the kappa test still be accurate? Or does it need to take into account that both scorers could have scored a 3 by chance?
Thanks!

chloegarnett
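An editorial note on the two questions above. On the first: if one scorer gives the same score throughout, observed agreement exactly equals chance-expected agreement (the constant scorer's marginal proportion for that score is 1), so the kappa numerator is zero by construction and there is no informative kappa to report. On the second: a score value that neither scorer ever used contributes nothing to the chance-agreement term, because its contribution is the product of two zero marginal proportions. A minimal, purely illustrative Python sketch of that second point, using a hypothetical kappa helper that takes an explicit category list:

```python
def cohens_kappa(rater1, rater2, categories):
    # Variant of Cohen's kappa with an explicit category list, so that
    # categories nobody actually used can be included deliberately.
    n = len(rater1)
    p_o = sum(a == b for a, b in zip(rater1, rater2)) / n
    p_e = sum((rater1.count(c) / n) * (rater2.count(c) / n)
              for c in categories)
    return (p_o - p_e) / (1 - p_e)

r1 = [0, 1, 2, 1, 0, 2, 1, 0]   # made-up scores on a 0-3 scale
r2 = [0, 1, 2, 2, 0, 2, 1, 1]

# Score 3 was never given: its p_e contribution is (0/8) * (0/8) = 0,
# so kappa is identical whether or not the category is listed.
print(cohens_kappa(r1, r2, categories=[0, 1, 2]))     # ~0.628
print(cohens_kappa(r1, r2, categories=[0, 1, 2, 3]))  # same value
```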

Is it possible to calculate Fleiss Kappa in SPSS?

shaileshjaiswal

Thank you so much for these explanations!

carreines

Thanks! Here is the level of agreement classification:
< 0.20        Poor
0.21 - 0.40   Fair
0.41 - 0.60   Moderate
0.61 - 0.80   Good
0.81 - 1.00   Very good

Prof.Dr.FurkanErolKarabekmez
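The scale quoted in the comment above (often attributed to Altman) is easy to encode as a quick lookup; an illustrative Python helper, treating each upper bound as inclusive so the bands cover the whole 0-1 range:

```python
def interpret_kappa(kappa):
    # Labels follow the classification quoted in the comment above.
    if kappa <= 0.20:
        return "Poor"
    if kappa <= 0.40:
        return "Fair"
    if kappa <= 0.60:
        return "Moderate"
    if kappa <= 0.80:
        return "Good"
    return "Very good"

print(interpret_kappa(0.52))  # Moderate
```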