<p>Has anyone here studied econometrics? Could you help me answer the questions below? I am really in a hurry; answers in Chinese are fine too. Thank you in advance! If you are not sure about all of them, answering just a few would also help. Thanks!</p><p>Group I (give a concise answer to each of the following questions)</p><p>1. Suppose the Gauss-Markov assumptions hold in a multiple linear regression model. What three characteristics of the model can contribute to a low sampling variance of the OLS estimator of a particular coefficient?</p><p>2. Given a multiple linear regression model where the Gauss-Markov assumptions hold, suppose you estimate the parameters by OLS. How would you predict the expected value of the dependent variable, given particular values of the regressors? How would you construct a prediction interval for that expected value of the dependent variable?</p><p>3. Explain intuitively why the weighted least squares (WLS) estimator has a smaller variance than the usual OLS estimator, in a model for cross-sectional data where the homoskedasticity assumption fails.</p><p>4. Consider a multiple linear regression model for cross-sectional data that analyses the impact of trade barriers on the national income of countries around the world. Among other regressors, you include dummies for Africa, Europe and Asia (equal to one if the country is, respectively, in Africa, Europe or Asia, and zero otherwise). You also include a constant in the model. What conditions must you impose on your sample so that the absence-of-multicollinearity assumption is not violated (due to the inclusion of the dummies)?</p><p>5. "As happens with cross-sectional data, we can always assume random sampling in time series analysis." Is this statement true? Explain.</p><p>6. What can go wrong in a regression model if the errors follow an AR(1) process? What can you do to solve the problem, and under what conditions can you do it? If you can't solve the problem, how can you conduct valid inference?</p>
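On the prediction question above, here is a minimal numpy/scipy sketch of how one would compute the point prediction of E[y|x = x0] after OLS and an interval around it. All data are simulated and all names are illustrative, not from any particular textbook.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated data roughly satisfying the Gauss-Markov assumptions
n, k = 200, 2
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])  # constant + 2 regressors
beta = np.array([1.0, 0.5, -0.3])
y = X @ beta + rng.normal(scale=1.0, size=n)

# OLS estimates
XtX_inv = np.linalg.inv(X.T @ X)
b = XtX_inv @ X.T @ y
resid = y - X @ b
sigma2 = resid @ resid / (n - X.shape[1])  # unbiased error-variance estimate

# Point prediction of E[y|x0] at particular regressor values x0
x0 = np.array([1.0, 1.0, 2.0])
y0_hat = x0 @ b

# Standard error of the estimated conditional mean and a 95% interval
se = np.sqrt(sigma2 * x0 @ XtX_inv @ x0)
t_crit = stats.t.ppf(0.975, df=n - X.shape[1])
ci = (y0_hat - t_crit * se, y0_hat + t_crit * se)
print(y0_hat, ci)
```

Note this interval is for the expected value of y given x0; an interval for a single future observation of y would additionally include the error variance sigma2 in the standard error.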
<p>7. Consider the following model for time series data:</p><p>Yt = b0 + b1*Xt1 + b2*Xt2 + … + bk*Xtk + ut</p><p>where the error term follows an AR(2) process (ut = p1*u(t-1) + p2*u(t-2) + et, with et independent of the regressors), but all the other assumptions needed to guarantee unbiasedness of the OLS estimator and the validity of "typical" OLS inference hold. How would you transform the model so that OLS estimation of the transformed model is equivalent to GLS estimation of the original model for t > 2 (don't worry about the first two observations)?</p><p>8. Suppose the linearity, strict exogeneity and absence-of-multicollinearity assumptions hold in a time series regression model that includes a linear trend as a regressor. What are the effects (in terms of bias on the estimators of the coefficients of the remaining regressors) of leaving the linear trend out of the model? Under what conditions is the bias nonexistent or negligible? (Answer in light of the analysis of omitted-variable bias.)</p><p>9. Consider a multiple linear regression model for cross-sectional data where the Gauss-Markov assumptions hold. You estimate the parameters by OLS with a sample of size n = 50. Then you add 50 extra observations to your sample, obtaining a sample of size n = 100. Can you guarantee that the OLS estimates obtained with the larger sample are closer to the true parameter values? Explain your answer.</p><p>10. Explain how you can implement the GLS estimator if there is heteroskedasticity in a multiple linear regression model for cross-sectional data (assume you know the form of the heteroskedasticity).</p><p>11. In a cross-sectional context, what measure of fit would you use to compare two linear regression models with the same dependent variable? Why?</p><p>12. Why is a "random walk" process non-stationary? Would you, in general, include a time series generated by such a process in a regression model? Why?</p>
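For the AR(2)-errors question above, the quasi-differencing transformation can be sketched in code. This is an illustrative simulation with p1 and p2 treated as known; in practice they would be estimated, e.g. by regressing the OLS residuals on their first two lags (a Cochrane-Orcutt-style procedure).

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate y_t = b0 + b1*x_t + u_t with stationary AR(2) errors
# u_t = p1*u_{t-1} + p2*u_{t-2} + e_t
T, b0, b1 = 300, 2.0, 0.7
p1, p2 = 0.5, 0.3
x = rng.normal(size=T)
e = rng.normal(scale=0.5, size=T)
u = np.zeros(T)
for t in range(2, T):
    u[t] = p1 * u[t - 1] + p2 * u[t - 2] + e[t]
y = b0 + b1 * x + u

# Quasi-differenced model for t > 2:
#   y_t - p1*y_{t-1} - p2*y_{t-2}
#     = b0*(1 - p1 - p2) + b1*(x_t - p1*x_{t-1} - p2*x_{t-2}) + e_t
# so the transformed model has a serially uncorrelated error e_t.
y_star = y[2:] - p1 * y[1:-1] - p2 * y[:-2]
x_star = x[2:] - p1 * x[1:-1] - p2 * x[:-2]
const_star = np.full_like(y_star, 1.0 - p1 - p2)

# OLS on the transformed model = GLS on the original (first 2 obs dropped);
# the coefficient on const_star recovers b0, the one on x_star recovers b1
Z = np.column_stack([const_star, x_star])
coef = np.linalg.lstsq(Z, y_star, rcond=None)[0]
print(coef)
```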
<p>13. Why would you prefer to correct for serial correlation and/or heteroskedasticity in a time series regression model, even in cases where you can estimate the parameters consistently by OLS and use (robust) test statistics that remain valid in the presence of such features?</p><p>14. Suppose you believe there is serial correlation in the error term of your linear regression model for time series data. You use a test aimed at detecting AR(1) serial correlation (one that is valid without assuming strict exogeneity of the regressors) and you do not reject the null of no AR(1) serial correlation. Further, none of the other standard assumptions seems to be violated in your model. Is it completely safe to proceed with "typical" OLS estimation and inference, given the result of the test? Explain.</p><p>15. Consider a linear regression model for time series data. Someone guarantees you that the model satisfies the homoskedasticity assumption (Var[ut|Xt1, Xt2, …, Xtk] = sigma^2 for all t), where ut is the error term and the Xtj are the regressors. Then you look at a plot of the dependent variable, and its variance does not seem constant (it exhibits volatility clustering: consecutive periods of high volatility). Can you reconcile this with the homoskedasticity assumption being satisfied?</p>
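The efficiency point behind the "why correct rather than just robustify" question (and the earlier WLS intuition question) can be illustrated with a small Monte Carlo. This is only a sketch under an assumed, known heteroskedasticity form Var(u|x) proportional to x^2: across replications, the WLS slope estimates should be visibly less dispersed than the OLS ones, even though both are consistent.

```python
import numpy as np

rng = np.random.default_rng(2)

# Heteroskedastic model: y = 1 + 2*x + u, with sd(u|x) proportional to x
n, reps = 100, 500
b_ols, b_wls = [], []
for _ in range(reps):
    x = rng.uniform(0.5, 3.0, size=n)
    u = rng.normal(scale=x)  # error sd grows with x
    y = 1.0 + 2.0 * x + u
    X = np.column_stack([np.ones(n), x])

    # OLS: consistent but inefficient under this heteroskedasticity
    b_ols.append(np.linalg.lstsq(X, y, rcond=None)[0][1])

    # WLS = OLS after dividing every variable by sqrt(h(x)) = x;
    # column 1 still corresponds to the slope after weighting
    Xw, yw = X / x[:, None], y / x
    b_wls.append(np.linalg.lstsq(Xw, yw, rcond=None)[0][1])

# Sampling variances across replications: WLS should be smaller
print(np.var(b_ols), np.var(b_wls))
```

Robust (heteroskedasticity-consistent) standard errors make OLS inference valid, but they do not shrink the sampling variance of the estimator itself; that is the gain from correcting.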