Original poster: sciencewu

Kappa coefficient is one of the most common approaches

Agreement between Categorical Measurements: Kappa Statistics
http://people.dbmi.columbia.edu/homepages/chuangj/kappa/

To assess the accuracy of any particular measuring 'instrument', it is usual to distinguish between the reliability of the data collected and their validity. Reliability is essentially the extent of the agreement between repeated measurements, and validity is the extent to which a method of measurement provides a true assessment of that which it purports to measure. When studying the variability of observer categorical ratings, two components of possible lack of accuracy must be distinguished. The first is inter-observer bias, which is reflected in differences in the marginal distributions of the response variable for each of the observers (Cochran's Q-test is the appropriate test for the hypothesis of no inter-observer bias). The second is observer disagreement, which is indicated by whether observers classify individual subjects into the same category on the measurement scale (the Kappa coefficient is one of the most common approaches). In this part, we will focus on the Kappa coefficient (or Kappa statistics).
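As an aside, the Cochran's Q test mentioned above can be computed directly from its standard formula. The sketch below is illustrative only: the three raters and the 0/1 ratings are hypothetical (not data from this post). Under the null hypothesis of no inter-observer bias, Q is referred to a chi-square distribution with k - 1 degrees of freedom.

```python
import numpy as np
from scipy.stats import chi2

# Hypothetical binary ratings: rows = subjects, columns = raters (1 = 'Yes', 0 = 'No')
X = np.array([
    [1, 1, 1],
    [1, 1, 0],
    [0, 1, 0],
    [1, 0, 0],
    [1, 1, 1],
    [0, 0, 0],
    [1, 1, 0],
    [1, 0, 0],
    [0, 1, 0],
    [1, 1, 1],
])
n, k = X.shape
col_totals = X.sum(axis=0)   # 'Yes' counts per rater (the marginals being compared)
row_totals = X.sum(axis=1)   # 'Yes' counts per subject
T = X.sum()

# Cochran's Q statistic; under H0 (no inter-observer bias) Q ~ chi-square, k-1 df
Q = k * (k - 1) * ((col_totals - T / k) ** 2).sum() / (row_totals * (k - row_totals)).sum()
p_value = chi2.sf(Q, k - 1)
print(Q, p_value)
```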
Kappa Statistics: an index which compares the agreement against that which might be expected by chance. Kappa can be thought of as the chance-corrected proportional agreement, and possible values range from +1 (perfect agreement) via 0 (no agreement above that expected by chance) to -1 (complete disagreement).

Hypothetical Example: 29 patients are examined by two independent doctors (see the table below). 'Yes' denotes the patient is diagnosed with disease X by a doctor. 'No' denotes the patient is classified as not having disease X by a doctor.


                 Doctor A: No    Doctor A: Yes    Total
Doctor B: No     10 (34.5%)       7 (24.1%)       17 (58.6%)
Doctor B: Yes     0 (0.0%)       12 (41.4%)       12 (41.4%)
Total            10 (34.5%)      19 (65.5%)       29

Kappa = (Observed agreement - Chance agreement) / (1 - Chance agreement)
Observed agreement = (10 + 12)/29 = 0.76
Chance agreement = 0.586 * 0.345 + 0.655 * 0.414 = 0.474
Kappa = (0.76 - 0.474)/(1 - 0.474) = 0.54
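The worked calculation above can be reproduced in a few lines. The sketch below assumes only the 2x2 table as given (rows = Doctor B, columns = Doctor A) and uses NumPy; it is an illustration, not any particular package's implementation.

```python
import numpy as np

# 2x2 contingency table from the example:
# rows = Doctor B (No, Yes), columns = Doctor A (No, Yes)
table = np.array([[10, 7],
                  [0, 12]])
n = table.sum()                  # 29 patients

observed = np.trace(table) / n   # observed agreement: (10 + 12)/29 ≈ 0.76
p_b = table.sum(axis=1) / n      # Doctor B marginals: 0.586, 0.414
p_a = table.sum(axis=0) / n      # Doctor A marginals: 0.345, 0.655
chance = np.dot(p_b, p_a)        # chance agreement ≈ 0.474

kappa = (observed - chance) / (1 - chance)
print(round(kappa, 2))           # 0.54
```

Given per-patient label vectors rather than a pre-tabulated table, sklearn.metrics.cohen_kappa_score should give the same value.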


Reply #1
sciencewu posted on 2010-5-23 16:13:48
1# sciencewu

If anyone has a more detailed introduction, please post it in this thread.


