Thread starter: jiliangamelia

When running OLS, if only the control variables are not normally distributed, is it still necessary to transform them to normality?


1# (OP)
jiliangamelia posted on 2010-2-21 09:31:26
Bounty: 2 forum coins
As the title asks. Thank you!!




2#
soccy posted on 2010-2-21 09:42:27
No need. The regressors do not have to follow a normal distribution.
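A minimal simulation sketch of this point, assuming only NumPy (the variable names and true coefficients below are invented for illustration): even when a control variable is strongly skewed, the OLS coefficient estimates remain unbiased.

import numpy as np

rng = np.random.default_rng(0)
n = 10_000
x1 = rng.normal(size=n)                  # main regressor
x2 = rng.exponential(scale=2.0, size=n)  # heavily skewed, non-normal control variable
eps = rng.normal(size=n)                 # only the error term enters the classical assumptions
y = 1.0 + 0.5 * x1 - 0.3 * x2 + eps      # true coefficients chosen arbitrarily

X = np.column_stack([np.ones(n), x1, x2])
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta_hat)  # roughly [1.0, 0.5, -0.3] despite the non-normal control variable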

3#
kickyras posted on 2010-2-21 11:37:16
OLS does not require any variable to be normally distributed. Read a couple of chapters of Wooldridge's Introductory Econometrics and you will see how OLS should be used.

4#
jiliangamelia posted on 2010-2-21 19:16:50
soccy wrote on 2010-2-21 09:42: "No need. The regressors do not have to follow a normal distribution."
Sigh...

5#
soccy posted on 2010-2-21 23:55:54
Why the sigh?

6#
jiliangamelia posted on 2010-2-21 23:58:09
soccy wrote on 2010-2-21 23:55: "Why the sigh?"
I can't find any material that explains this in detail, so it all still feels hazy to me. Frustrating.

7#
爱萌 posted on 2010-2-27 22:41:39
Heh, regression is about learning the relationship between some variables and others. The control variables here are just regressors; it is the dependent variable that needs to be normal, and later on it turns out that belonging to the exponential family is enough.

8#
bobguy posted on 2010-2-28 04:13:54
爱萌 wrote on 2010-2-27 22:41: "Heh, regression is about learning the relationship between some variables and others. The control variables here are just regressors; it is the dependent variable that needs to be normal, and later on it turns out that belonging to the exponential family is enough."
That is a BIG blunder.

Where do you see "the dependent variable needs to be normally distributed" in a regression model? Only the first and second moments of the error term in a regression model are assumed; no 'shape' (normality) is required.

As for "later on it turns out that belonging to the exponential family is enough": if y does not belong to the exponential family, what happens then? Does the regression model become invalid?

9#
zj_ocean posted on 2010-2-28 13:23:12
8# bobguy

For OLS, we really do not need the error terms to be normal. But if you want to do statistical inference on the betas or on the responses, we usually assume that the error terms are normal, because so many of our statistical tools are derived from normality, like the F-test and the chi-square test. And luckily, the maximum likelihood estimators (obtained by assuming the responses are normal) coincide with OLS.

Generalized Linear Models (as distinct from General Linear Models) are used to deal with exponential-family distributions. The corresponding estimation method is weighted least squares applied iteratively to obtain the ML estimates.

Sometimes we might also normalize or standardize the covariates because of their units. If your model contains covariates such as length, volume, speed, or weight measured in very different units, it can be better to standardize the data.
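The standardization remark can be illustrated with a short sketch, assuming the statsmodels package is available (the units and coefficients below are invented for illustration): rescaling covariates that are measured on very different scales changes the size of the reported coefficients but not the fit or the t-statistics, so it is a matter of interpretability rather than validity.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 500
length_cm = rng.normal(170, 10, size=n)       # e.g. a length measured in centimetres
weight_g = rng.normal(70_000, 8_000, size=n)  # e.g. a weight in grams, a much larger scale
y = 3.0 + 0.02 * length_cm + 0.0001 * weight_g + rng.normal(size=n)

raw = np.column_stack([length_cm, weight_g])
std = (raw - raw.mean(axis=0)) / raw.std(axis=0)   # centre and scale each covariate

fit_raw = sm.OLS(y, sm.add_constant(raw)).fit()
fit_std = sm.OLS(y, sm.add_constant(std)).fit()
print(fit_raw.tvalues[1:], fit_std.tvalues[1:])    # slope t-statistics are identical
print(fit_raw.rsquared, fit_std.rsquared)          # so is the R-squared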

10#
bobguy posted on 2010-2-28 21:50:12
Quoting zj_ocean (2010-2-28 13:23), 9# above:
"But if you want to take statistical inference of betas or responses, wealways assume that the error terms be normal. Because we have so manystatistical tools from normal, like F-test, chi-sq test."   ---  Agree ! For a small sample size, the normality is necessary for statistic inference. This assumption can be relaxed under a large sample size.

"And luckly, the Maximum Likelihood estimators"  OLS is almost the same as ML under normality assumption. But variance estimation is biased and need to be adjusted by DF in ML. Both are equivelant under a large sample size. OLS was used by Gauss and others before the normal distribution was introduced by Gauss several years later. And ML was invented by Fisher many years later. I believe he got some idea from Laplace.

"Sometime, we might normalize or standardlize the covariates for theunits. If your models contains covariates length, volume, speed, orweight with various units, you'd better standardlize the data. " When you want to contrast the coefs, the normalization of the covariates makes sense. But it is not necessary.
