OP: jenney (6358 views, 7 replies)

Multicollinearity Problem


Question: which software can produce the eigenvalues and condition index needed to test for multicollinearity?


#2  蓝色  posted 2006-4-2 17:29
Both SAS and Stata can do it.

#3  awing  posted 2006-4-2 18:38
If you want free software, R (www.r-project.org) is an option.

#4  statax  posted 2006-4-2 23:21

SPSS works too.

It just comes down to computing the largest and smallest eigenvalues.
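Since it really is just an eigenvalue computation, the diagnostic can be sketched in a few lines of numpy. This is a toy illustration (the data, variable names, and the unit-length column scaling are my own assumptions, not any package's output):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.05, size=n)  # nearly identical to x1
x3 = rng.normal(size=n)

# Design matrix with intercept; scale each column to unit length,
# as is conventional before computing condition indices.
X = np.column_stack([np.ones(n), x1, x2, x3])
Xs = X / np.linalg.norm(X, axis=0)

eigvals = np.linalg.eigvalsh(Xs.T @ Xs)          # eigenvalues of the scaled X'X
cond_indices = np.sqrt(eigvals.max() / eigvals)  # one condition index per eigenvalue

print("eigenvalues:      ", np.sort(eigvals)[::-1])
print("condition indices:", np.sort(cond_indices))
```

The largest condition index is the condition number; with x2 nearly equal to x1 it lands far above the usual warning thresholds.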


#5  jenney  posted 2006-4-3 21:01

Thanks!

Can't I get them in EViews?

#6  lmyshq  posted 2006-5-18 00:10

Be careful with multicollinearity tests. These methods usually only measure the degree of collinearity between two variables; measuring it among several variables at once is much harder.

#7  hanszhu  posted 2006-5-18 01:01
  • Multicollinearity in regression models is an unacceptably high level of intercorrelation among the independents, such that the effects of the independents cannot be separated. Under multicollinearity, estimates are unbiased, but assessments of the relative strength of the explanatory variables and of their joint effect are unreliable. (That is, beta weights and R-squareds cannot be interpreted reliably, even though predicted values are still the best estimates given the independents.) As a rule of thumb, intercorrelation among the independents above .80 signals a possible problem. Likewise, high multicollinearity is signalled when a high R-squared and a significant F test for the model occur in combination with non-significant t-tests of the individual coefficients.

  • That is, whereas perfect multicollinearity leads to infinite standard errors and indeterminate coefficients, the more common situation of high multicollinearity leads to large variances and covariances of the estimates, wide confidence intervals, and statistically insignificant coefficients. Power is low: the chance of a Type II error is high, i.e., failing to reject the null hypothesis that a coefficient is zero when a relationship in fact exists. R-squared is high. The coefficients and their standard errors are sensitive to changes in just a few observations.

  • Tolerance is defined as 1 - R-squared, where R-squared is obtained by regressing a given independent on all the other independents. If the tolerance value is less than some cutoff, usually .20, the independent should be dropped from the analysis due to multicollinearity. This is better than simply requiring pairwise r < .80, since tolerance relates each independent to all the other independents jointly, not just through simple correlations. In SPSS 13, select Analyze, Regression, Linear; click Statistics; check Collinearity diagnostics.

  • Variance inflation factor (VIF). The VIF may be used in lieu of tolerance, as it is simply the reciprocal of tolerance. A common rule of thumb is that VIF > 4.0 signals a possible multicollinearity problem; some authors use the more lenient cutoff of VIF >= 5. The SPSS path is the same as above.

  • Condition indices. Discussed more extensively in the section on regression: condition indices over 15 indicate possible multicollinearity problems, and over 30 indicate serious multicollinearity problems. The SPSS path is the same as above.
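The tolerance/VIF recipe above can also be computed outside SPSS. A minimal numpy sketch (the toy data and variable names are illustrative assumptions, not from the post):

```python
import numpy as np

def tolerance_and_vif(X):
    """For each column j: regress X[:, j] on the remaining columns (plus an
    intercept); tolerance_j = 1 - R^2 of that regression, VIF_j = 1/tolerance_j."""
    n, p = X.shape
    results = []
    for j in range(p):
        y = X[:, j]
        A = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ beta
        r2 = 1.0 - resid.var() / y.var()
        tol = 1.0 - r2
        results.append((tol, 1.0 / tol))
    return results

rng = np.random.default_rng(1)
x1 = rng.normal(size=500)
x2 = 0.9 * x1 + rng.normal(scale=0.2, size=500)  # strongly collinear with x1
x3 = rng.normal(size=500)
X = np.column_stack([x1, x2, x3])

for name, (tol, vif) in zip(["x1", "x2", "x3"], tolerance_and_vif(X)):
    print(f"{name}: tolerance = {tol:.3f}, VIF = {vif:.2f}")
```

With this toy data, x1 and x2 should fail both rules of thumb (tolerance well under .20, VIF well over 4), while x3 is unaffected.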


#8  hanszhu  posted 2006-5-18 01:03

Multicollinearity in Structural Equation Models (SEM)

  • Standardized regression weights: Since all the latent variables in a SEM model have been assigned a metric of 1, all the standardized regression weights should be within the range of plus or minus 1. When there is a multicollinearity problem, a weight close to 1 indicates the two variables are close to being identical. When these two nearly identical latent variables are then used as causes of a third latent variable, the SEM method will have difficulty computing separate regression weights for the two paths from the nearly-equal variables and the third variable. As a result it may well come up with one standardized regression weight greater than +1 and one weight less than -1 for these two paths.

  • Standard errors of the unstandardized regression weights: Likewise, when there are two nearly identical latent variables, and these two are used as causes of a third latent variable, the difficulty in computing separate regression weights may well be reflected in much larger standard errors for these paths than for other paths in the model, reflecting high multicollinearity of the two nearly identical variables.

  • Covariances of the parameter estimates: Likewise, the same difficulty in computing separate regression weights may well be reflected in high covariances of the parameter estimates for these paths - estimates much higher than the covariances of parameter estimates for other paths in the model.

  • Variance estimates: Another effect of the same multicollinearity syndrome may be negative variance estimates. In the example above of two nearly-identical latent variables causing a third latent variable, the variance estimate of this third variable may be negative.
