Using Gauss for Econometrics
Carter Hill and Lee Adkins
August 30, 2001
Contents
1 Introduction 2
1.1 Using the On-line Help . . . . . . . . . . . . . . . . . . . . . . . . 2
1.1.1 Functions, Operators, and Categories . . . . . . . . . . . 3
1.1.2 Run-Time Library and User-Defined Functions . . . . . . 4
1.1.3 Item Selection . . . . . . . . . . . . . . . . . . . . . . . . 4
1.1.4 Entering Requests . . . . . . . . . . . . . . . . . . . . . . 4
1.2 Getting Started . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
1.2.1 Starting and Exiting GAUSS . . . . . . . . . . . . . . . 9
1.2.2 Creating a Data File Using the GAUSS Editor . . . . . . 9
1.2.3 Matrices . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
1.2.4 Concatenation . . . . . . . . . . . . . . . . . . . . . . . . 11
1.2.5 Special Matrices . . . . . . . . . . . . . . . . . . . . . . . 11
1.2.6 Indexing Matrices and Extracting Submatrices . . . . . . 12
2 The Editor 14
2.1 COMMAND Mode . . . . . . . . . . . . . . . . . . . . . . . . . . 14
2.2 EDIT mode . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15
3 Operators 20
3.1 Relational Operators . . . . . . . . . . . . . . . . . . . . . . . . . 20
3.2 Matrix Relational Operators . . . . . . . . . . . . . . . . . . . . . 21
3.3 Scalar Relational Operators . . . . . . . . . . . . . . . . . . . . . 22
3.4 Matrix Logical Operators . . . . . . . . . . . . . . . . . . . . . . 22
4 GAUSS Fundamentals 23
4.1 Precision and Rounding . . . . . . . . . . . . . . . . . . . . . . . 23
4.2 Conditional Branching . . . . . . . . . . . . . . . . . . . . . . . . 23
4.3 Unconditional Branching . . . . . . . . . . . . . . . . . . . . . . . 24
4.4 Looping . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24
4.5 Subroutines . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25
4.6 Procedures . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25
5 Linear Statistical Models 27
5.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27
5.2 Linear Statistical Model 1 . . . . . . . . . . . . . . . . . . . . . . 27
5.3 Linear Statistical Model 2 . . . . . . . . . . . . . . . . . . . . . . 27
5.4 The General Linear Statistical Model
Model 3 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 29
5.4.1 Point Estimation . . . . . . . . . . . . . . . . . . . . . . . 30
5.5 Sampling Properties of the Least Squares Rule . . . . . . . . . . 30
5.5.1 Sampling Properties–The Gauss-Markov Result . . . . . . 31
5.5.2 Estimating the Scale Parameter σ² . . . . . . . . . . . . . 31
5.5.3 Prediction and Degree of Explanation . . . . . . . . . . . 31
5.5.4 OLS Proc . . . . . . . . . . . . . . . . . . . . . . . . . . . 32
5.6 A Monte Carlo Experiment to Demonstrate the Sampling Performance
of the Least Squares Estimator . . . . . . . . . . . . . . 33
6 The Normal General Linear Model 38
6.1 Maximum Likelihood Estimation . . . . . . . . . . . . . . . . . . 38
6.2 Restricted Maximum Likelihood Estimation . . . . . . . . . . . . 41
6.3 Interval Estimation . . . . . . . . . . . . . . . . . . . . . . . . . . 42
6.3.1 Single Linear Combination of the Beta Vector . . . . . . . 42
6.3.2 Two or More Linear Combinations of the Beta Vector . . 43
6.3.3 Interval Estimation of σ² . . . . . . . . . . . . . . . . . . 44
6.3.4 Prediction Interval Estimator . . . . . . . . . . . . . . . . 44
6.4 Hypothesis Testing . . . . . . . . . . . . . . . . . . . . . . . . . . 45
6.4.1 The Likelihood Ratio Test Statistic . . . . . . . . . . . . . 45
6.4.2 A Single Hypothesis . . . . . . . . . . . . . . . . . . . . . 46
6.4.3 Testing a Hypothesis about σ² . . . . . . . . . . . . . . . 47
6.5 Summary Statement . . . . . . . . . . . . . . . . . . . . . . . . . 47
6.6 Asymptotic Properties of the Least Squares Estimator . . . . . . 48
7 Bayesian Inference: II 53
7.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 53
7.2 A Simple Model . . . . . . . . . . . . . . . . . . . . . . . . . . . . 53
7.3 Bayesian Inference for the General Linear Model with Known
Disturbance Variance . . . . . . . . . . . . . . . . . . . . . . . . . 55
7.4 An Example . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 55
7.5 Point Estimation . . . . . . . . . . . . . . . . . . . . . . . . . . . 59
7.6 Comparing Hypotheses and Posterior Odds . . . . . . . . . . . . 60
7.7 Bayesian Inference for the General Linear Model with Unknown
Disturbance Variance . . . . . . . . . . . . . . . . . . . . . . . . . 61
8 General Linear Statistical Model 64
8.1 The Statistical Model and Estimators . . . . . . . . . . . . . . . 64
8.2 The Normal Linear Statistical Model . . . . . . . . . . . . . . . . 69
8.3 Sampling distributions of the Maximum Likelihood Estimators . 69
8.4 Interval Estimators . . . . . . . . . . . . . . . . . . . . . . . . . . 70
8.5 Hypothesis Testing . . . . . . . . . . . . . . . . . . . . . . . . . . 70
8.6 The Consequences of Using Least Squares . . . . . . . . . . . . . 70
8.7 Prediction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 71
9 General Linear Model with Unknown Covariance 72
9.1 Background . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 72
9.2 Estimated Generalized Least Squares . . . . . . . . . . . . . . . . 72
9.3 Heteroskedasticity . . . . . . . . . . . . . . . . . . . . . . . . . . 72
9.3.1 The Estimated Generalized Least Squares Estimator . . . 75
9.4 Exercises on Heteroskedasticity . . . . . . . . . . . . . . . . . . . 76
9.5 Autocorrelation . . . . . . . . . . . . . . . . . . . . . . . . . . . . 76
9.6 Exact Durbin-Watson Statistic . . . . . . . . . . . . . . . . . . . 82
10 Varying Parameter Models 84
10.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 84
10.2 Use of Dummy Variables in Estimation . . . . . . . . . . . . . . . 84
10.3 The Use of Dummy Variables to Test for a Change in the Location
Vector . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 86
10.4 Systematically Varying Parameter Models . . . . . . . . . . . . . 87
10.5 Hildreth-Houck Random Coefficient Models . . . . . . . . . . . . 88
11 Sets of Linear Statistical Models 90
12 Estimating Nonlinear Models 106
12.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 106
12.2 Principles of Nonlinear Least Squares . . . . . . . . . . . . . . . . 106
12.3 Estimation of Linear Models with General Covariance Matrix . . 115
12.4 Nonlinear Seemingly Unrelated Regression Equations . . . . . . . 132
12.5 Functional Form – The Box-Cox Transformation . . . . . . . . . 135
13 Stochastic Regressors 140
13.1 Independent Stochastic Regressor Model . . . . . . . . . . . . . . 140
13.2 Partially Independent Stochastic Regressors . . . . . . . . . . . . 140
13.3 General Stochastic Regressor Models . . . . . . . . . . . . . . . . 140
13.4 Measurement Errors . . . . . . . . . . . . . . . . . . . . . . . . . 141
14 Simultaneous Linear Statistical Models: I 145
14.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 145
14.2 Specification of the Sampling Model . . . . . . . . . . . . . . . . 145
14.3 Least Squares Bias . . . . . . . . . . . . . . . . . . . . . . . . . . 146
14.4 The Problem of Going from the Reduced-Form Parameters to the
Structural Parameters . . . . . . . . . . . . . . . . . . . . . . . . 147
15 Simultaneous Linear Statistical Models: II 150
16 Time-Series Analysis and Forecasting 157
16.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 157
17 Distributed Lags 165
17.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 165
17.2 Unrestricted Finite Distributed Lags . . . . . . . . . . . . . . . . 165
17.3 Finite Polynomial Lags . . . . . . . . . . . . . . . . . . . . . . . . 168
17.4 Infinite Distributed Lags . . . . . . . . . . . . . . . . . . . . . . . 170
18 Multiple Time Series 177
18.1 Background . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 177
18.2 Vector Autoregressive Processes . . . . . . . . . . . . . . . . . . . 177
18.3 Estimation and Specification of VAR Processes . . . . . . . . . . 178
18.4 Forecasting Vector Autoregressive Processes . . . . . . . . . . . . 181
18.5 Granger Causality . . . . . . . . . . . . . . . . . . . . . . . . . . 182
18.6 Innovation Accounting and Forecast Error Variance Decomposition 183
19 Qualitative and Limited Dependent Variable Models 186
19.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 186
19.2 Binary Choice Models . . . . . . . . . . . . . . . . . . . . . . . . 186
19.3 Models with Limited Dependent Variables . . . . . . . . . . . . . 190
20 Biased Estimation 194
20.1 Statistical Decision Theory . . . . . . . . . . . . . . . . . . . . . 194
20.2 Combining Sample and Nonsample Information . . . . . . . . . . 194
20.3 Pretest and Stein Rule Estimators . . . . . . . . . . . . . . . . . 201
20.4 Model Specification . . . . . . . . . . . . . . . . . . . . . . . . . . 201
21 Multicollinearity 204
21.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 204
21.2 The Statistical Consequences of Multicollinearity . . . . . . . . . 205
21.3 Detecting the Presence, Severity, and Form of Multicollinearity . 205
21.4 Solutions to the Multicollinearity Problem . . . . . . . . . . . . . 207
22 Robust Estimation 211
22.1 The Consequences of Nonnormal Disturbances . . . . . . . . . . 211
22.2 Regression Diagnostics . . . . . . . . . . . . . . . . . . . . . . . . 211
22.3 Estimation Under Multivariate-t Errors . . . . . . . . . . . . . . 211
22.4 Estimation Using Regression Quantiles . . . . . . . . . . . . . . . 211