Original poster: roncwm_massey

Moderators, please help me answer this; perhaps nobody here knows this problem.

11
zhaosweden posted on 2006-3-11 21:54:00
(8)

The best explanation I've heard for understanding degrees of freedom in various statistical calculations is as follows:

Degrees of freedom of n - 1 are required when taking a sample from a population because, with a limited-size sample, you have only a very slight chance of picking the extreme data values of the population. To compensate for this, you subtract 1 from n (using n - 1) in order to inflate the standard deviation and make the standard deviation calculation more adequately represent the parent population.

Notice that in the calculation of the population standard deviation, the sum of squares is divided by the value N and not n - 1. This is because all values in the population (including extreme values) are taken into account.
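To make the contrast concrete, here is a minimal Python sketch (my own illustration, not from the quoted explanation) computing both versions on the same toy data; the n - 1 divisor gives the slightly larger, "inflated" value:

import math

data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
n = len(data)
mean = sum(data) / n
ss = sum((x - mean) ** 2 for x in data)     # sum of squared deviations from the mean

pop_sd = math.sqrt(ss / n)                  # divide by N: all values, extremes included, are in hand
sample_sd = math.sqrt(ss / (n - 1))         # divide by n - 1: inflated to stand in for the population

print(f"divide by N:     {pop_sd:.4f}")     # 2.0000
print(f"divide by n - 1: {sample_sd:.4f}")  # 2.1381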

(9)

I have an excellent BB Coach who is able to explain things to me so that I can understand them. The way he explained degrees of freedom to me is as follows: you are given 4 checkers and you are told to place a checker in each corner of the board. For the first placement you have 4 choices (degrees of freedom); for the second you have 3; for the third placement you only have 2; and for the 4th you have none, because the last position is forced. So it works out to n - 1: if n is 4, then you only have 3 degrees of freedom.

12
zhaosweden posted on 2006-3-11 21:57:00
(10)

My BB coach taught the same lucid way with a different example. It is also important to note that the necessity to subtract 1 from n becomes negligible as n increases in size, roughly where n > 30.

(zhaosweden: of course, when the sample size increases, the loss in df is not important; but when the sample size is small, we prefer a simple model so that we can estimate things more accurately.)
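To see how quickly the correction fades, here is a tiny Python loop (my own illustration) printing the n/(n - 1) factor by which the "divide by n - 1" variance exceeds the "divide by n" variance:

for n in (5, 10, 30, 100, 1000):
    factor = n / (n - 1)   # ratio of the (n - 1)-divisor variance to the n-divisor variance
    print(f"n = {n:4d}: inflation factor = {factor:.4f}")
# n = 5 gives 1.2500, n = 30 gives 1.0345, n = 1000 gives 1.0010: negligible for large n.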

13
zhaosweden posted on 2006-3-11 21:58:00

In what follows I give some definition-like stuff:

(a)

Statisticians use the term "degrees of freedom" to describe the number of values in the final calculation of a statistic that are free to vary. Consider, for example, the statistic s² (the sample variance).

To calculate s² for a random sample, we must first calculate the mean of that sample and then compute the sum of the several squared deviations from that mean. While there will be n such squared deviations, only (n - 1) of them are, in fact, free to assume any value whatsoever. This is because the final squared deviation from the mean must include the one value of X such that the sum of all the Xs divided by n will equal the obtained mean of the sample. All of the other (n - 1) squared deviations from the mean can, theoretically, have any values whatsoever. For these reasons, the statistic s² is said to have only (n - 1) degrees of freedom.
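A quick numerical illustration of that constraint (my own, in Python, not part of the quoted text): the deviations from the sample mean must sum to zero, so once n - 1 of them are known, the last one is no longer free.

sample = [3.0, 7.0, 8.0, 10.0]
mean = sum(sample) / len(sample)           # 7.0
deviations = [x - mean for x in sample]    # [-4.0, 0.0, 1.0, 3.0]

print(sum(deviations))                     # 0.0: the deviations always sum to zero
last = -sum(deviations[:-1])               # so the final deviation is forced by the other n - 1
print(last, deviations[-1])                # 3.0  3.0 (equal, up to rounding)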


14
zhaosweden posted on 2006-3-11 21:58:00
(b)

Estimates of parameters can be based upon different amounts of information. The number of independent pieces of information that go into the estimate of a parameter is called the degrees of freedom (df). In general, the degrees of freedom of an estimate is equal to the number of independent scores that go into the estimate minus the number of parameters estimated as intermediate steps in the estimation of the parameter itself. For example, if the variance, σ², is to be estimated from a random sample of N independent scores, then the degrees of freedom is equal to the number of independent scores (N) minus the number of parameters estimated as intermediate steps (one, μ is estimated by M) and is therefore equal to N - 1.
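As a small worked example of this bookkeeping (my own sketch in Python; the data are made up and the statistics module call is just a cross-check), df = N - 1 because the mean M is estimated as an intermediate step:

import statistics

scores = [12.0, 15.0, 11.0, 14.0, 18.0]
N = len(scores)
M = statistics.mean(scores)          # mu is estimated by M: one parameter used up
df = N - 1                           # N independent scores minus 1 estimated parameter

ss = sum((x - M) ** 2 for x in scores)
print(ss / df)                       # 7.5, the estimate of sigma^2 on N - 1 = 4 df
print(statistics.variance(scores))   # 7.5, the library uses the same N - 1 divisor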


15
zhaosweden posted on 2006-3-11 22:01:00

(c)

DEGREES OF FREEDOM EXPLANATION

Sampling and Statistics

When the population standard deviation (σ) is unknown and the sample size is less than 30 (n < 30), the distribution of the test statistic cannot be guaranteed to be normal. In fact, the test statistic can be said to conform to what is called a t distribution.

The t distribution is similar to the standard normal distribution in that it is symmetrically distributed around a mean value. But where the t distribution differs from the standard normal is that its standard deviation is determined by what is known as the number of degrees of freedom.

Degrees of freedom are calculated from the size of the sample. They are a measure of the amount of information from the sample data that has been used up. Every time a statistic is calculated from a sample, one degree of freedom is used up.

When you have a very large sample drawn from your data set, the difference between t values and Z values is minuscule. But as the size of your sample falls, the t distribution takes on a standard deviation increasingly greater than 1. In other words, the t distribution, when n < 30, is more spread out than the standard normal distribution.
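As a rough check of that claim, here is a short Python sketch (my own addition, assuming SciPy is installed; none of this comes from the quoted page) comparing the 97.5th-percentile t values with the corresponding Z value, along with the t distribution's standard deviation, at several degrees of freedom:

from scipy.stats import norm, t

z_975 = norm.ppf(0.975)                  # about 1.960 for the standard normal
for df in (5, 10, 30, 100):
    t_975 = t.ppf(0.975, df)             # 97.5th percentile of the t distribution with df degrees of freedom
    t_sd = (df / (df - 2)) ** 0.5        # std dev of the t distribution (valid for df > 2)
    print(f"df = {df:3d}: t = {t_975:.3f} vs Z = {z_975:.3f}, t std dev = {t_sd:.3f}")
# e.g. df = 5 gives t = 2.571 and std dev 1.291, while df = 100 gives t = 1.984 and std dev 1.010.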

16
zhaosweden posted on 2006-3-11 22:02:00

OK, let me stop here. Above, I made my own comments and pasted some Internet materials. I hope that if you are patient enough to go through them, you will have a better understanding of "degrees of freedom", such that you now have more degrees of freedom in Econometrics!

This concept can be confusing to entry-level students. There is also a degrees-of-freedom concept in Physics and Chemistry; if you are interested, you can type such keywords into Google.

Moreover, the attached zip file is a Word DOC of an explanation from a website; I think it is very nice.

43084.rar (54.59 KB) This attachment includes:
  • Degrees of freedom1.doc


17
leewinjing posted on 2006-3-12 01:22:00
Impressive, the answers are vivid and detailed!! Amazing!
