Thread starter: weihancool

[Original Post] Linear Models and Generalizations: Least Squares and Alternatives

weihancool posted on 2008-01-29 13:00:00

Table of Contents

1 Introduction
  1.1 Linear Models and Regression Analysis
  1.2 Plan of the Book
2 The Simple Linear Regression Model
  2.1 The Linear Model
  2.2 Least Squares Estimation
  2.3 Direct Regression Method
  2.4 Properties of the Direct Regression Estimators
  2.5 Centered Model
  2.6 No Intercept Term Model
  2.7 Maximum Likelihood Estimation
  2.8 Testing of Hypotheses and Confidence Interval Estimation
  2.9 Analysis of Variance
  2.10 Goodness of Fit of Regression
  2.11 Reverse Regression Method
  2.12 Orthogonal Regression Method
  2.13 Reduced Major Axis Regression Method
  2.14 Least Absolute Deviation Regression Method
  2.15 Estimation of Parameters when X Is Stochastic
3 The Multiple Linear Regression Model and Its Extensions
  3.1 The Linear Model
  3.2 The Principle of Ordinary Least Squares (OLS)
  3.3 Geometric Properties of OLS
  3.4 Best Linear Unbiased Estimation
    3.4.1 Basic Theorems
    3.4.2 Linear Estimators
    3.4.3 Mean Dispersion Error
  3.5 Estimation (Prediction) of the Error Term ε and σ²
  3.6 Classical Regression under Normal Errors
    3.6.1 The Maximum-Likelihood (ML) Principle
    3.6.2 Maximum Likelihood Estimation in Classical Normal Regression
  3.7 Consistency of Estimators
  3.8 Testing Linear Hypotheses
  3.9 Analysis of Variance
  3.10 Goodness of Fit
  3.11 Checking the Adequacy of Regression Analysis
    3.11.1 Univariate Regression
    3.11.2 Multiple Regression
    3.11.3 A Complex Example
    3.11.4 Graphical Presentation
  3.12 Linear Regression with Stochastic Regressors
    3.12.1 Regression and Multiple Correlation Coefficient
    3.12.2 Heterogeneous Linear Estimation without Normality
    3.12.3 Heterogeneous Linear Estimation under Normality
  3.13 The Canonical Form
  3.14 Identification and Quantification of Multicollinearity
    3.14.1 Principal Components Regression
    3.14.2 Ridge Estimation
    3.14.3 Shrinkage Estimates
    3.14.4 Partial Least Squares
  3.15 Tests of Parameter Constancy
    3.15.1 The Chow Forecast Test
    3.15.2 The Hansen Test
    3.15.3 Tests with Recursive Estimation
    3.15.4 Test for Structural Change
  3.16 Total Least Squares
  3.17 Minimax Estimation
    3.17.1 Inequality Restrictions
    3.17.2 The Minimax Principle
  3.18 Censored Regression
    3.18.1 Overview
    3.18.2 LAD Estimators and Asymptotic Normality
    3.18.3 Tests of Linear Hypotheses
  3.19 Simultaneous Confidence Intervals
  3.20 Confidence Interval for the Ratio of Two Linear Parametric Functions
  3.21 Nonparametric Regression
    3.21.1 Estimation of the Regression Function
  3.22 Classification and Regression Trees (CART)
  3.23 Boosting and Bagging
  3.24 Projection Pursuit Regression
  3.25 Neural Networks and Nonparametric Regression
  3.26 Logistic Regression and Neural Networks
  3.27 Functional Data Analysis (FDA)
  3.28 Restricted Regression
    3.28.1 Problem of Selection
    3.28.2 Theory of Restricted Regression
    3.28.3 Efficiency of Selection
    3.28.4 Explicit Solution in Special Cases
  3.29 LINEX Loss Function
  3.30 Balanced Loss Function
  3.31 Complements
    3.31.1 Linear Models without Moments: Exercise
    3.31.2 Nonlinear Improvement of OLSE for Nonnormal Disturbances
    3.31.3 A Characterization of the Least Squares Estimator
    3.31.4 A Characterization of the Least Squares Estimator: A Lemma
  3.32 Exercises
4 The Generalized Linear Regression Model
  4.1 Optimal Linear Estimation of β
    4.1.1 R1-Optimal Estimators
    4.1.2 R2-Optimal Estimators
    4.1.3 R3-Optimal Estimators
  4.2 The Aitken Estimator
  4.3 Misspecification of the Dispersion Matrix
  4.4 Heteroscedasticity and Autoregression
  4.5 Mixed Effects Model: Unified Theory of Linear Estimation
    4.5.1 Mixed Effects Model
    4.5.2 A Basic Lemma
    4.5.3 Estimation of Xβ (the Fixed Effect)
    4.5.4 Prediction of Uξ (the Random Effect)
    4.5.5 Estimation of ε
  4.6 Linear Mixed Models with Normal Errors and Random Effects
    4.6.1 Maximum Likelihood Estimation of Linear Mixed Models
    4.6.2 Restricted Maximum Likelihood Estimation of Linear Mixed Models
    4.6.3 Inference for Linear Mixed Models
  4.7 Regression-Like Equations in Econometrics
    4.7.1 Econometric Models
    4.7.2 The Reduced Form
    4.7.3 The Multivariate Regression Model
    4.7.4 The Classical Multivariate Linear Regression Model
    4.7.5 Stochastic Regression
    4.7.6 Instrumental Variable Estimator
    4.7.7 Seemingly Unrelated Regressions
    4.7.8 Measurement Error Models
  4.8 Simultaneous Parameter Estimation by Empirical Bayes Solutions
    4.8.1 Overview
    4.8.2 Estimation of Parameters from Different Linear Models
  4.9 Supplements
  4.10 Gauss-Markov, Aitken and Rao Least Squares Estimators
    4.10.1 Gauss-Markov Least Squares
    4.10.2 Aitken Least Squares
    4.10.3 Rao Least Squares
  4.11 Exercises
5 Exact and Stochastic Linear Restrictions
  5.1 Use of Prior Information
  5.2 The Restricted Least-Squares Estimator
  5.3 Maximum Likelihood Estimation under Exact Restrictions
  5.4 Stepwise Inclusion of Exact Linear Restrictions
  5.5 Biased Linear Restrictions and MDE Comparison with the OLSE
  5.6 MDE Matrix Comparisons of Two Biased Estimators
  5.7 MDE Matrix Comparison of Two Linear Biased Estimators
  5.8 MDE Comparison of Two (Biased) Restricted Estimators
  5.9 Stein-Rule Estimators under Exact Restrictions
  5.10 Stochastic Linear Restrictions
    5.10.1 Mixed Estimator
    5.10.2 Assumptions about the Dispersion Matrix
    5.10.3 Biased Stochastic Restrictions
  5.11 Stein-Rule Estimators under Stochastic Restrictions
  5.12 Weakened Linear Restrictions
    5.12.1 Weakly (R, r)-Unbiasedness
    5.12.2 Optimal Weakly (R, r)-Unbiased Estimators
    5.12.3 Feasible Estimators: Optimal Substitution of β in β̂1(β, A)
    5.12.4 RLSE instead of the Mixed Estimator
  5.13 Exercises
6 Prediction in the Generalized Regression Model
  6.1 Introduction
  6.2 Some Simple Linear Models
    6.2.1 The Constant Mean Model
    6.2.2 The Linear Trend Model
    6.2.3 Polynomial Models
  6.3 The Prediction Model
  6.4 Optimal Heterogeneous Prediction
  6.5 Optimal Homogeneous Prediction
  6.6 MDE Matrix Comparisons between Optimal and Classical Predictors
    6.6.1 Comparison of Classical and Optimal Prediction with Respect to the y* Superiority
    6.6.2 Comparison of Classical and Optimal Predictors with Respect to the X*β Superiority
  6.7 Prediction Regions
    6.7.1 Concepts and Definitions
    6.7.2 On q-Prediction Intervals
    6.7.3 On q-Intervals in Regression Analysis
    6.7.4 On (p, q)-Prediction Intervals
    6.7.5 Linear Utility Functions
    6.7.6 Normally Distributed Populations: Two-Sided Symmetric Intervals
    6.7.7 One-Sided Infinite Intervals
    6.7.8 Utility and Length of Intervals
    6.7.9 Utility and Coverage
    6.7.10 Maximal Utility and Optimal Tests
    6.7.11 Prediction Ellipsoids Based on the GLSE
    6.7.12 Comparing the Efficiency of Prediction Ellipsoids
  6.8 Simultaneous Prediction of Actual and Average Values of y
    6.8.1 Specification of Target Function
    6.8.2 Exact Linear Restrictions
    6.8.3 MDEP Using Ordinary Least Squares Estimator
    6.8.4 MDEP Using Restricted Estimator
    6.8.5 MDEP Matrix Comparison
    6.8.6 Stein-Rule Predictor
    6.8.7 Outside Sample Predictions
  6.9 Kalman Filter
    6.9.1 Dynamical and Observational Equations
    6.9.2 Some Theorems
    6.9.3 Kalman Model
  6.10 Exercises
7 Sensitivity Analysis
  7.1 Introduction
  7.2 Prediction Matrix
  7.3 Effect of Single Observation on Estimation of Parameters
    7.3.1 Measures Based on Residuals
    7.3.2 Algebraic Consequences of Omitting an Observation
    7.3.3 Detection of Outliers
  7.4 Diagnostic Plots for Testing the Model Assumptions
  7.5 Measures Based on the Confidence Ellipsoid
  7.6 Partial Regression Plots
  7.7 Regression Diagnostics for Removing an Observation with Graphics
  7.8 Model Selection Criteria
    7.8.1 Akaike's Information Criterion
    7.8.2 Bayesian Information Criterion
    7.8.3 Mallows' Cp
    7.8.4 Example
  7.9 Exercises
8 Analysis of Incomplete Data Sets
  8.1 Statistical Methods with Missing Data
    8.1.1 Complete Case Analysis
    8.1.2 Available Case Analysis
    8.1.3 Filling in the Missing Values
    8.1.4 Model-Based Procedures
  8.2 Missing-Data Mechanisms
    8.2.1 Missing Indicator Matrix
    8.2.2 Missing Completely at Random
    8.2.3 Missing at Random
    8.2.4 Nonignorable Nonresponse
  8.3 Missing Pattern
  8.4 Missing Data in the Response
    8.4.1 Least-Squares Analysis for Filled-up Data: Yates Procedure
    8.4.2 Analysis of Covariance: Bartlett's Method
  8.5 Shrinkage Estimation by Yates Procedure
    8.5.1 Shrinkage Estimators
    8.5.2 Efficiency Properties
  8.6 Missing Values in the X-Matrix
    8.6.1 General Model
    8.6.2 Missing Values and Loss in Efficiency
  8.7 Methods for Incomplete X-Matrices
    8.7.1 Complete Case Analysis
    8.7.2 Available Case Analysis
    8.7.3 Maximum-Likelihood Methods
  8.8 Imputation Methods for Incomplete X-Matrices
    8.8.1 Maximum-Likelihood Estimates of Missing Values
    8.8.2 Zero-Order Regression
    8.8.3 First-Order Regression
    8.8.4 Multiple Imputation
    8.8.5 Weighted Mixed Regression
    8.8.6 The Two-Stage WMRE
  8.9 Assumptions about the Missing Mechanism
  8.10 Regression Diagnostics to Identify Non-MCAR Processes
    8.10.1 Comparison of the Means
    8.10.2 Comparing the Variance-Covariance Matrices
    8.10.3 Diagnostic Measures from Sensitivity Analysis
    8.10.4 Distribution of the Measures and Test Procedure
  8.11 Treatment of Nonignorable Nonresponse
    8.11.1 Joint Distribution of (X, Y) with Missing Values Only in Y
    8.11.2 Conditional Distribution of Y Given X with Missing Values Only in Y
    8.11.3 Conditional Distribution of Y Given X with Missing Values Only in X
    8.11.4 Other Approaches
  8.12 Further Literature
  8.13 Exercises
9 Robust Regression
  9.1 Overview
  9.2 Least Absolute Deviation Estimators: Univariate Case
  9.3 M-Estimates: Univariate Case
  9.4 Asymptotic Distributions of LAD Estimators
    9.4.1 Univariate Case
    9.4.2 Multivariate Case
  9.5 General M-Estimates
  9.6 Tests of Significance
10 Models for Categorical Response Variables
  10.1 Generalized Linear Models
    10.1.1 Extension of the Regression Model
    10.1.2 Structure of the Generalized Linear Model
    10.1.3 Score Function and Information Matrix
    10.1.4 Maximum-Likelihood Estimation
    10.1.5 Testing of Hypotheses and Goodness of Fit
    10.1.6 Overdispersion
    10.1.7 Quasi Loglikelihood
  10.2 Contingency Tables
    10.2.1 Overview
    10.2.2 Ways of Comparing Proportions
    10.2.3 Sampling in Two-Way Contingency Tables
    10.2.4 Likelihood Function and Maximum-Likelihood Estimates
    10.2.5 Testing the Goodness of Fit
  10.3 GLM for Binary Response
    10.3.1 Logit Models and Logistic Regression
    10.3.2 Testing the Model
    10.3.3 Distribution Function as a Link Function
  10.4 Logit Models for Categorical Data
  10.5 Goodness of Fit: Likelihood-Ratio Test
  10.6 Loglinear Models for Categorical Variables
    10.6.1 Two-Way Contingency Tables
    10.6.2 Three-Way Contingency Tables
  10.7 The Special Case of Binary Response
  10.8 Coding of Categorical Explanatory Variables
    10.8.1 Dummy and Effect Coding
    10.8.2 Coding of Response Models
    10.8.3 Coding of Models for the Hazard Rate
  10.9 Extensions to Dependent Binary Variables
    10.9.1 Overview
    10.9.2 Modeling Approaches for Correlated Response
    10.9.3 Quasi-Likelihood Approach for Correlated Binary Response
    10.9.4 The GEE Method by Liang and Zeger
    10.9.5 Properties of the GEE Estimate β̂G
    10.9.6 Efficiency of the GEE and IEE Methods
    10.9.7 Choice of the Quasi-Correlation Matrix Rt(α)
    10.9.8 Bivariate Binary Correlated Response Variables
    10.9.9 The GEE Method
    10.9.10 The IEE Method
    10.9.11 An Example from the Field of Dentistry
    10.9.12 Full Likelihood Approach for Marginal Models
  10.10 Exercises
A Matrix Algebra
  A.1 Overview
  A.2 Trace of a Matrix
  A.3 Determinant of a Matrix
  A.4 Inverse of a Matrix
  A.5 Orthogonal Matrices
  A.6 Rank of a Matrix
  A.7 Range and Null Space
  A.8 Eigenvalues and Eigenvectors
  A.9 Decomposition of Matrices
  A.10 Definite Matrices and Quadratic Forms
  A.11 Idempotent Matrices
  A.12 Generalized Inverse
  A.13 Projectors
  A.14 Functions of Normally Distributed Variables
  A.15 Differentiation of Scalar Functions of Matrices
  A.16 Miscellaneous Results, Stochastic Convergence
B Tables
C Software for Linear Regression Models
  C.1 Software
  C.2 Special-Purpose Software
  C.3 Resources
References
Index
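Since the post only reproduces the book's table of contents, a small numerical illustration of its two most central topics may help readers judge whether the material is relevant to them. The sketch below is not taken from the book; it assumes Python with numpy, uses made-up data, and computes an ordinary least squares estimate (cf. Section 3.2) and a ridge estimate (cf. Section 3.14.2). The variable names and the choice of k are illustrative assumptions only.

    # Minimal sketch (not from the book): OLS and ridge estimates for y = X b + e.
    import numpy as np

    # Illustrative data: intercept plus two regressors, n = 100 observations.
    rng = np.random.default_rng(0)
    n, p = 100, 3
    X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
    beta_true = np.array([1.0, 2.0, -0.5])
    y = X @ beta_true + rng.normal(scale=0.3, size=n)

    # Ordinary least squares, b = (X'X)^(-1) X'y; lstsq avoids forming the inverse explicitly.
    beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)

    # Ridge estimator, b(k) = (X'X + kI)^(-1) X'y, with an arbitrarily chosen k = 0.1.
    k = 0.1
    beta_ridge = np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

    print("OLS estimate:  ", beta_ols)
    print("Ridge estimate:", beta_ridge)

With well-conditioned data like this the two estimates nearly coincide; the ridge term only matters when X'X is close to singular, which is the multicollinearity setting treated in Section 3.14.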

Ratings from 2 members (Academic level / Helpfulness / Credit rating):
aldenhan: +1 / +1 / +1 (Excellent post)
sljzhangbiao11: +1 / +1 / +1 (Just what I was about to study!)
Total: Academic level +2, Helpfulness +2, Credit rating +2

hloving posted on 2009-05-18 10:33:00

Not enough money.


liu2008shu posted on 2012-03-01 21:08:37
Is there an electronic version?


wang1fan posted on 2015-10-30 08:24:06
Why does what I see look garbled?

