Kappa Coefficient

Cohen's kappa coefficient is a statistical measure of inter-rater agreement for qualitative (categorical) items. It is generally considered a more robust measure than a simple percent-agreement calculation, since κ takes into account the agreement that would be expected to occur by chance.
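In short, κ compares the agreement two raters actually achieved with the agreement their individual answer rates would produce by coincidence. A minimal sketch of that calculation in Python, using made-up yes/no ratings (the function and data are illustrative only):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters judging the same items (illustrative sketch)."""
    n = len(rater_a)
    # Observed agreement: fraction of items both raters labelled identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: probability the raters coincide if each labels items
    # independently at their own overall ("marginal") rates.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum((freq_a[c] / n) * (freq_b[c] / n) for c in set(rater_a) | set(rater_b))
    return (p_o - p_e) / (1 - p_e)

# Hypothetical yes/no ratings from two reviewers on ten items.
a = ["yes", "yes", "no", "yes", "no", "no",  "yes", "yes", "no", "yes"]
b = ["yes", "no",  "no", "yes", "no", "yes", "yes", "yes", "no", "no"]
print(cohens_kappa(a, b))  # ≈ 0.4 (observed agreement 0.7 vs. chance agreement 0.5)
```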
Comments
Author

I didn't understand a damn thing you said (I don't understand English), but these four minutes were better than an hour with my professor. Thank you very much.

Teksuyi
Author

So helpful; watching a 4.5-minute video sure beats a 50-minute lecture.

chenshilongsun
Author

Simple, very well explained, nicely presented, clear voice. Excellent, thank you so much, this video is very useful.

fernandoduartemolina
Author

This is pretty quick and effective, it seems.

Understanding the formula and how it works in depth surely takes more than 5 minutes, but it sure saves some work lmao

Thank you for this

heikochujikyo
Author

Amazing explanation! I don't understand why people who make such good content get fewer followers.

knowingtruthisbliss
Author

This was VERY helpful and simplified the concept. Thank you. Please do more videos!

ezzrabella
Author

What I thought was impossible to understand took only one great 4-minute video. Thanks a lot!

ehssan
Author

The explanation is quite clear, though the numbers can be refined a bit: agreement: 63.1578947368421%, Cohen's κ: 0.10738255033557026. Thanks for the video!

arnoudvanrooij
Author

Great explanation (with nice sketches as a bonus), thank you!

rafa_leo_siempre
Author

Great, concise explanation, thank you. I will be passing this on.

jaminv
Author

The only thing I do not understand is the "Chance Agreements", the AC calculation of .58. I understand where the numbers come from, but I do not understand the theory behind why the arithmetic works to give us this concept of "chance" agreement. All of the numbers in the table are what was observed to have happened...how can we just take some of the values in the table and call it "chance" agreement? Where is the actual proof they agreed by chance in .58 of the cases?
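For anyone stuck on the same point: the 0.58 is not a claim that 58% of the actual agreements happened by accident. It is the agreement rate you would expect if each rater kept their overall tendency to say yes or no (the marginal rates read off the table's row and column totals) but assigned labels independently of the other rater. Under that independence assumption, the chance of coinciding on an item is the product of the two marginal rates, summed over the categories:

```latex
% Expected (chance) agreement for two independent raters on a yes/no item:
\[
  AC \;=\; p_A(\text{yes})\,p_B(\text{yes}) \;+\; p_A(\text{no})\,p_B(\text{no})
\]
% Hypothetical marginals (not the video's): if rater A says "yes" 70% of the time
% and rater B says "yes" 80% of the time, then
\[
  AC = 0.7 \times 0.8 + 0.3 \times 0.2 = 0.62
\]
```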

EvaSlash
Author

Still not clear how we come to the final kappa equation. Why (OA - AC)? Why divide by (1 - AC)? The rationale is obscure to me.
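On those two questions: the numerator OA - AC is how much better the raters did than chance alone, and the denominator 1 - AC is the most they could possibly have beaten chance by, since perfect agreement is 1. Kappa is therefore the fraction of the achievable above-chance agreement that was actually achieved:

```latex
\[
  \kappa \;=\; \frac{OA - AC}{1 - AC},
  \qquad
  \kappa = 0 \ \text{when } OA = AC \ \text{(no better than chance)},
  \qquad
  \kappa = 1 \ \text{when } OA = 1 \ \text{(perfect agreement)}.
\]
% Worked example using the 0.58 chance agreement mentioned above and a
% hypothetical observed agreement of 0.79:
\[
  \kappa = \frac{0.79 - 0.58}{1 - 0.58} = \frac{0.21}{0.42} = 0.5
\]
```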

genwei
Author

Can you please elaborate on the meaning of a high or low kappa value? I can now calculate kappa, but what does it mean?

simonchan
Author

Just awesome!!! THANKS. Please make more such videos on biostatistics.

bhushankamble
Author

Accurate, sharp, and to the point. Thank you, sir! :)

gokhancam
Author

How do you calculate the agreement for a scale like "strongly disagree, disagree, agree, strongly agree"? And what is the formula for calculating just the 'observed agreement'?
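For a four-point scale like that, observed agreement is still just the share of items on which the two raters picked exactly the same category. A minimal sketch with made-up Likert ratings:

```python
# Hypothetical ratings on a 4-point Likert scale from two raters.
rater_a = ["agree", "strongly agree", "disagree", "agree", "strongly disagree", "agree"]
rater_b = ["agree", "agree",          "disagree", "agree", "strongly disagree", "disagree"]

# Observed agreement: fraction of items where both raters chose the same category.
observed_agreement = sum(a == b for a, b in zip(rater_a, rater_b)) / len(rater_a)
print(observed_agreement)  # 4 matches out of 6 items -> 0.666...
```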

MrThesyeight
Author

Thank you so much! Problem is, I don't have a "YES" or "NO" answer from each rater. I have a grade of 1-5 given by each rater. Can I still calculate Kappa?
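On the 1-5 grades question above: kappa is not restricted to yes/no answers. With ordinal grades, a weighted kappa, which penalises a 4-vs-5 disagreement less than a 1-vs-5 one, is the usual choice. One way to compute it, sketched with scikit-learn's cohen_kappa_score and invented grades:

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical 1-5 grades from two raters on the same eight items.
rater_a = [1, 2, 3, 5, 4, 2, 3, 5]
rater_b = [1, 3, 3, 4, 4, 2, 2, 5]

# Unweighted kappa treats a 4-vs-5 disagreement the same as a 1-vs-5 one.
print(cohen_kappa_score(rater_a, rater_b))
# Quadratically weighted kappa gives partial credit for near-misses on ordinal scales.
print(cohen_kappa_score(rater_a, rater_b, weights="quadratic"))
```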

KnightMD
Author

Question please:

We are supposed to do kappa scoring for dentistry, but we have 5 graders. How do we do such a thing?
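Cohen's kappa in the form shown here covers exactly two raters. With five graders, the usual options are Fleiss' kappa or averaging the pairwise Cohen's kappas. A sketch of the former using statsmodels, with invented dental ratings (the category codes and data are hypothetical):

```python
import numpy as np
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

# Hypothetical data: rows are cases, columns are the 5 graders,
# values are category codes (0 = "no caries", 1 = "caries").
ratings = np.array([
    [1, 1, 1, 0, 1],
    [0, 0, 0, 0, 0],
    [1, 0, 1, 1, 1],
    [0, 0, 1, 0, 0],
    [1, 1, 1, 1, 1],
])

# Convert the cases-by-raters matrix into a cases-by-categories count table,
# then compute Fleiss' kappa on it.
table, _ = aggregate_raters(ratings)
print(fleiss_kappa(table, method="fleiss"))
```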

danjosh
Author

Thank you. If I have data with high agreement between both observers, should I choose the results of just one of the raters, or should I use the mean of both raters' ratings?

daliael-rouby
Author

What range of kappa values indicates good agreement, and what range indicates low agreement?
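There is no single agreed cut-off, but the Landis and Koch (1977) benchmarks are the ones most often quoted. A small helper encoding them (treat the thresholds as rules of thumb, not strict boundaries):

```python
def interpret_kappa(kappa):
    """Landis & Koch (1977) rule-of-thumb labels for kappa values."""
    if kappa < 0:
        return "poor (worse than chance)"
    if kappa <= 0.20:
        return "slight"
    if kappa <= 0.40:
        return "fair"
    if kappa <= 0.60:
        return "moderate"
    if kappa <= 0.80:
        return "substantial"
    return "almost perfect"

print(interpret_kappa(0.72))  # substantial
```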

zicodgra