1. Consider the topic of multicollinearity:
a. Define and explain the problem of multicollinearity and relate it to the expression for Var(β̂_j) and its impact on the precision of β̂_j. Relate your answer to an unreasonable expectation of the data in terms of identifying the ceteris paribus effects of the independent variables.
b. Give an example of a k = 2 multiple regression model that is likely to suffer from near multicollinearity, and support your example.
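(Not part of the question, but a short simulation can build intuition for part a. This sketch, with made-up coefficients and a hypothetical near-collinear regressor, compares the standard error of a slope estimate when the second regressor is almost a copy of the first versus when it is independent.)

```python
import numpy as np

# Illustrative sketch: near-collinear regressors inflate Var(beta_hat_j),
# so se(beta_hat_1) is much larger than with an uncorrelated second regressor.
rng = np.random.default_rng(0)

def ols_with_se(X, y):
    """OLS via the normal equations; X includes a constant column."""
    XtX_inv = np.linalg.inv(X.T @ X)
    beta = XtX_inv @ X.T @ y
    resid = y - X @ beta
    sigma2 = resid @ resid / (len(y) - X.shape[1])  # sigma_hat^2, df = n - k - 1
    return beta, np.sqrt(sigma2 * np.diag(XtX_inv))

n = 200
x1 = rng.normal(size=n)
x2_collinear = x1 + 0.05 * rng.normal(size=n)  # corr(x1, x2) close to 1
x2_indep = rng.normal(size=n)                  # uncorrelated benchmark

def make_y(x2):
    # Arbitrary illustrative coefficients (1, 2, 3), fresh errors each call.
    return 1.0 + 2.0 * x1 + 3.0 * x2 + rng.normal(size=n)

_, se_bad = ols_with_se(np.column_stack([np.ones(n), x1, x2_collinear]),
                        make_y(x2_collinear))
_, se_good = ols_with_se(np.column_stack([np.ones(n), x1, x2_indep]),
                         make_y(x2_indep))

print(f"se(beta_1) with near-collinear x2: {se_bad[1]:.3f}")
print(f"se(beta_1) with independent x2:    {se_good[1]:.3f}")
```

The inflation matches the variance-inflation-factor term 1/(1 − R²_j) in the usual expression for Var(β̂_j): as corr(x1, x2) → 1, R²_1 → 1 and the standard error explodes.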
2. Two questions about the multiple regression model:
a. List and define the six CLM assumptions and identify which of these is the critical but untestable assumption guaranteeing that β̂_j is unbiased.
b. Distinguish between sd(β̂_j) and se(β̂_j), and relate the distinction to how the distribution of (β̂_j − β_j)/sd(β̂_j) compares to that of (β̂_j − β_j)/se(β̂_j) under (i) finite sampling (n < ∞) and (ii) large sampling (n → ∞).
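(A study aid for part b, not part of the question. The sketch below uses a stripped-down model with a single slope and no intercept, so the degrees of freedom are n − 1 rather than n − k − 1; the sample size and seed are arbitrary. It shows that the ratio using the true sd is exactly standard normal, while the ratio using the estimated se has heavier t-tails in small samples.)

```python
import numpy as np

# sd(beta_hat) uses the unknown sigma^2; se(beta_hat) replaces it with
# sigma_hat^2.  In finite samples the se-based ratio is t-distributed
# (heavier tails than N(0,1)); as n grows the two coincide.
rng = np.random.default_rng(1)
n, reps, beta, sigma = 8, 20000, 2.0, 1.0
z_stats, t_stats = [], []
for _ in range(reps):
    x = rng.normal(size=n)
    yv = beta * x + sigma * rng.normal(size=n)   # one slope, no intercept
    xx = x @ x
    b = (x @ yv) / xx                            # OLS slope
    resid = yv - b * x
    sigma_hat2 = resid @ resid / (n - 1)         # df = n - 1 here
    z_stats.append((b - beta) / np.sqrt(sigma**2 / xx))     # uses sd
    t_stats.append((b - beta) / np.sqrt(sigma_hat2 / xx))   # uses se

z_tail = np.mean(np.abs(np.array(z_stats)) > 1.96)  # ~0.05 (standard normal)
t_tail = np.mean(np.abs(np.array(t_stats)) > 1.96)  # larger (t with 7 df)
print(f"P(|Z| > 1.96) ~ {z_tail:.3f}")
print(f"P(|T| > 1.96) ~ {t_tail:.3f}")
```

With n only 8, the se-based statistic exceeds 1.96 noticeably more often than 5% of the time; rerunning with a large n makes the two tail frequencies converge, which is the large-sample part of the question.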
3. Consider the property of consistency.
a. If W_n, as a function of the sample size n, is an estimator of the parameter θ, define consistency and relate it to the properties of bias and efficiency. Define and explain the property of asymptotic bias of W_n. If W_n is unbiased and consistent, describe what happens to the distribution of W_n as n → ∞.
b. Prove that β̂_j is consistent under the Gauss-Markov assumptions for the case of k = 1 (that is, in the simple regression model).
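(An illustrative companion to question 3, not a substitute for the proof. This sketch, with arbitrary true coefficients and sample sizes, shows the simple-regression slope estimate β̂_1 = Ĉov(x, y)/V̂ar(x) collapsing onto β_1 as n grows, which is the behavior the consistency proof establishes when Cov(x, u) = 0.)

```python
import numpy as np

# Consistency in the simple regression y = b0 + b1*x + u with Cov(x, u) = 0:
# the sampling spread of beta1_hat around b1 shrinks toward zero as n grows.
rng = np.random.default_rng(2)
beta0, beta1 = 1.0, 0.5  # arbitrary illustrative values

def beta1_hat(n):
    x = rng.normal(size=n)
    u = rng.normal(size=n)          # error uncorrelated with x
    yv = beta0 + beta1 * x + u
    return np.cov(x, yv, ddof=1)[0, 1] / np.var(x, ddof=1)

for n in (50, 500, 5000, 50000):
    draws = np.array([beta1_hat(n) for _ in range(200)])
    print(f"n = {n:6d}  mean |beta1_hat - beta1| = "
          f"{np.mean(np.abs(draws - beta1)):.4f}")
```

The mean absolute error falls roughly like 1/√n, consistent with β̂_1 being unbiased with variance shrinking to zero, so its distribution degenerates to a point mass at β_1 as n → ∞ (the answer to part a's final question).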