
[Other] Bayesian Model Selection and Statistical Modeling


Posted by kukenghuqian on 2018-11-16 05:35:37
Bayesian Model Selection and Statistical Modeling.pdf (5.89 MB, requires: 10 forum coins)



Contents
Preface xiii
1 Introduction 1
1.1 Statistical models . . . . . . . . . . . . . . . . . . . . . . . . 1
1.2 Bayesian statistical modeling . . . . . . . . . . . . . . . . . . 6
1.3 Book organization . . . . . . . . . . . . . . . . . . . . . . . . 8
2 Introduction to Bayesian analysis 13
2.1 Probability and Bayes’ theorem . . . . . . . . . . . . . . . . 13
2.2 Introduction to Bayesian analysis . . . . . . . . . . . . . . . 15
2.3 Bayesian inference on statistical models . . . . . . . . . . . . 17
2.4 Sampling density specification . . . . . . . . . . . . . . . . . 19
2.4.1 Probability density specification . . . . . . . . . . . . 19
2.4.2 Econometrics: Quantifying price elasticity of demand . 20
2.4.3 Financial econometrics: Describing a stock market behavior . . . . . . . . . . 21
2.4.4 Bioinformatics: Tumor classification with gene expression data . . . . . . . . . . 22
2.4.5 Psychometrics: Factor analysis model . . . . . . . . . 23
2.4.6 Marketing: Survival analysis model for quantifying customer lifetime value . . . . . . . . . . 24
2.4.7 Medical science: Nonlinear logistic regression models . 25
2.4.8 Under the limited computer resources . . . . . . . . . 26
2.5 Prior distribution . . . . . . . . . . . . . . . . . . . . . . . . 26
2.5.1 Diffuse priors . . . . . . . . . . . . . . . . . . . . . . . 26
2.5.2 The Jeffreys’ prior . . . . . . . . . . . . . . . . . . . . 27
2.5.3 Conjugate priors . . . . . . . . . . . . . . . . . . . . . 27
2.5.4 Informative priors . . . . . . . . . . . . . . . . . . . . 27
2.5.5 Other priors . . . . . . . . . . . . . . . . . . . . . . . . 28
2.6 Summarizing the posterior inference . . . . . . . . . . . . . . 28
2.6.1 Point estimates . . . . . . . . . . . . . . . . . . . . . . 28
2.6.2 Interval estimates . . . . . . . . . . . . . . . . . . . . . 29
2.6.3 Densities . . . . . . . . . . . . . . . . . . . . . . . . . 29
2.6.4 Predictive distributions . . . . . . . . . . . . . . . . . 30
2.7 Bayesian inference on linear regression models . . . . . . . . 30
2.8 Bayesian model selection problems . . . . . . . . . . . . . . . 33
2.8.1 Example: Subset variable selection problem . . . . . . 33
2.8.2 Example: Smoothing parameter selection problem . . 35
2.8.3 Summary . . . . . . . . . . . . . . . . . . . . . . . . . 37
3 Asymptotic approach for Bayesian inference 43
3.1 Asymptotic properties of the posterior distribution . . . . . . 43
3.1.1 Consistency . . . . . . . . . . . . . . . . . . . . . . . . 43
3.1.2 Asymptotic normality of the posterior mode . . . . . . 44
3.1.3 Example: Asymptotic normality of the posterior mode of logistic regression . . . . . . . . . . 45
3.2 Bayesian central limit theorem . . . . . . . . . . . . . . . . . 46
3.2.1 Bayesian central limit theorem . . . . . . . . . . . . . 47
3.2.2 Example: Poisson distribution with conjugate prior . . 49
3.2.3 Example: Confidence intervals . . . . . . . . . . . . . . 50
3.3 Laplace method . . . . . . . . . . . . . . . . . . . . . . . . . 51
3.3.1 Laplace method for integral . . . . . . . . . . . . . . . 51
3.3.2 Posterior expectation of a function of parameter . . . 53
3.3.3 Example: Bernoulli distribution with a uniform prior . 55
3.3.4 Asymptotic approximation of the Bayesian predictive distribution . . . . . . . . . . 57
3.3.5 Laplace method for approximating marginal posterior distribution . . . . . . . . . . 58
4 Computational approach for Bayesian inference 63
4.1 Monte Carlo integration . . . . . . . . . . . . . . . . . . . . . 63
4.2 Markov chain Monte Carlo methods for Bayesian inference . . . . . . . . . . 64
4.2.1 Gibbs sampler . . . . . . . . . . . . . . . . . . . . . . 65
4.2.2 Metropolis-Hastings sampler . . . . . . . . . . . . . . 65
4.2.3 Convergence check . . . . . . . . . . . . . . . . . . . . 67
4.2.4 Example: Gibbs sampling for seemingly unrelated regression model . . . . . . . . . . 68
4.2.5 Example: Gibbs sampling for auto-correlated errors . . 73
4.3 Data augmentation . . . . . . . . . . . . . . . . . . . . . . . 76
4.3.1 Probit model . . . . . . . . . . . . . . . . . . . . . . . 76
4.3.2 Generating random samples from the truncated normal density . . . . . . . . . . 78
4.3.3 Ordered probit model . . . . . . . . . . . . . . . . . . 79
4.4 Hierarchical modeling . . . . . . . . . . . . . . . . . . . . . . 81
4.4.1 Lasso . . . . . . . . . . . . . . . . . . . . . . . . . . . 81
4.4.2 Gibbs sampling for Bayesian Lasso . . . . . . . . . . . 82
4.5 MCMC studies for the Bayesian inference on various types of models . . . . . . . . . . 83
4.5.1 Volatility time series models . . . . . . . . . . 83
4.5.2 Simultaneous equation model . . . . . . . . . . 84
4.5.3 Quantile regression . . . . . . . . . . . . . . . . . . . . 86
4.5.4 Graphical models . . . . . . . . . . . . . . . . . . . . . 88
4.5.5 Multinomial probit models . . . . . . . . . . . . . . . 88
4.5.6 Markov switching models . . . . . . . . . . . . . . . . 90
4.6 Noniterative computation methods for Bayesian inference . . . . . . . . . . 93
4.6.1 The direct Monte Carlo . . . . . . . . . . . . . . . . . 93
4.6.2 Importance sampling . . . . . . . . . . . . . . . . . . . 94
4.6.3 Rejection sampling . . . . . . . . . . . . . . . . . . . . 95
4.6.4 Weighted bootstrap . . . . . . . . . . . . . . . . . . . 96
5 Bayesian approach for model selection 101
5.1 General framework . . . . . . . . . . . . . . . . . . . . . . . 101
5.2 Definition of the Bayes factor . . . . . . . . . . . . . . . . . . 103
5.2.1 Example: Hypothesis testing 1 . . . . . . . . . . . . . 104
5.2.2 Example: Hypothesis testing 2 . . . . . . . . . . . . . 105
5.2.3 Example: Poisson models with conjugate priors . . . 106
5.3 Exact calculation of the marginal likelihood . . . . . . . . . . 108
5.3.1 Example: Binomial model with conjugate prior . . . . 108
5.3.2 Example: Normal regression model with conjugate prior and Zellner’s g-prior . . . . . . . . . . 109
5.3.3 Example: Multi-response normal regression model . . 111
5.4 Laplace’s method and asymptotic approach for computing the marginal likelihood . . . . . . . . . . 113
5.5 Definition of the Bayesian information criterion . . . . . . . 115
5.5.1 Example: Evaluation of the approximation error . . . 116
5.5.2 Example: Link function selection for binomial regression . . . . . . . . . . 116
5.5.3 Example: Selecting the number of factors in factor analysis model . . . . . . . . . . 118
5.5.4 Example: Survival analysis . . . . . . . . . . . . . . . 121
5.5.5 Consistency of the Bayesian information criteria . . . 124
5.6 Definition of the generalized Bayesian information criterion . 125
5.6.1 Example: Nonlinear regression models using basis expansion predictors . . . . . . . . . . 126
5.6.2 Example: Multinomial logistic model with basis expansion predictors . . . . . . . . . . 132
5.7 Bayes factor with improper prior . . . . . . . . . . . . . . . 141
5.7.1 Intrinsic Bayes factors . . . . . . . . . . . . . . . . . . 142
5.7.2 Partial Bayes factor and fractional Bayes factor . . . . 146
5.7.3 Posterior Bayes factors . . . . . . . . . . . . . . . . . . 147
5.7.4 Pseudo Bayes factors based on cross validation . . . . 148
5.7.4.1 Example: Bayesian linear regression model with improper prior . . . . . . . . . . 148
5.8 Expected predictive likelihood approach for Bayesian model selection . . . . . . . . . . 149
5.8.1 Predictive likelihood for model selection . . . . . . . . 150
5.8.2 Example: Normal model with conjugate prior . . . . . 152
5.8.3 Example: Bayesian spatial modeling . . . . . . . . . . 152
5.9 Other related topics . . . . . . . . . . . . . . . . . . . . . . . 155
5.9.1 Bayes factors when model dimension grows . . . . . . 155
5.9.2 Bayesian p-values . . . . . . . . . . . . . . . . . . . . . 156
5.9.3 Bayesian sensitivity analysis . . . . . . . . . . . . . . 157
5.9.3.1 Example: Sensitivity analysis of Value at Risk 158
5.9.3.2 Example: Bayesian change point analysis . . 160
6 Simulation approach for computing the marginal likelihood 169
6.1 Laplace-Metropolis approximation . . . . . . . . . . . . . . . 169
6.1.1 Example: Multinomial probit models . . . . . . . . . 170
6.2 Gelfand-Dey’s approximation and the harmonic mean estimator . . . . . . . . . . 172
6.2.1 Example: Bayesian analysis of the ordered probit model 172
6.3 Chib’s estimator from Gibbs sampling . . . . . . . . . . 174
6.3.1 Example: Seemingly unrelated regression model with informative prior . . . . . . . . . . 176
6.3.1.1 Calculation of the marginal likelihood . . . . 177
6.4 Chib’s estimator from MH sampling . . . . . . . . . . . . . . 179
6.5 Bridge sampling methods . . . . . . . . . . . . . . . . . . . . 181
6.6 The Savage-Dickey density ratio approach . . . . . . . . . . . 182
6.6.1 Example: Bayesian linear regression model . . . . . . 182
6.7 Kernel density approach . . . . . . . . . . . . . . . . . . . . 185
6.7.1 Example: Bayesian analysis of the probit model . . . 185
6.8 Direct computation of the posterior model probabilities . . . 187
6.8.1 Reversible jump MCMC . . . . . . . . . . . . . . . . . 187
6.8.2 Example: Reversible jump MCMC for seemingly unrelated regression model with informative prior . . . . . . . . . . 188
6.8.3 Product space search and metropolized product space search . . . . . . . . . . 190
6.8.4 Bayesian variable selection for large model space . . . 192
7 Various Bayesian model selection criteria 199
7.1 Bayesian predictive information criterion . . . . . . . . . . . 199
7.1.1 The posterior mean of the log-likelihood and the expected log-likelihood . . . . . . . . . . 199
7.1.2 Bias correction for the posterior mean of the log-likelihood . . . . . . . . . . 201
7.1.3 Definition of the Bayesian predictive information criterion . . . . . . . . . . 201
7.1.4 Example: Bayesian generalized state space modeling . . . . . . . . . . 204
7.2 Deviance information criterion . . . . . . . . . . . . . . . . . 214
7.2.1 Example: Hierarchical Bayesian modeling for logistic regression . . . . . . . . . . 215
7.3 A minimum posterior predictive loss approach . . . . . . . . 216
7.4 Modified Bayesian information criterion . . . . . . . . . . . . 218
7.4.1 Example: P-spline regression model with Gaussian noise . . . . . . . . . . 220
7.4.2 Example: P-spline logistic regression . . . . . . . . . . 221
7.5 Generalized information criterion . . . . . . . . . . . . . . . 222
7.5.1 Example: Heterogeneous error model for the analysis of motorcycle impact data . . . . . . . . . . 226
7.5.2 Example: Microarray data analysis . . . . . . . . . . 227
8 Theoretical development and comparisons 235
8.1 Derivation of Bayesian information criteria . . . . . . . . . . 235
8.2 Derivation of generalized Bayesian information criteria . . . 237
8.3 Derivation of Bayesian predictive information criterion . . . 238
8.3.1 Derivation of BPIC . . . . . . . . . . . . . . . . . . . . 239
8.3.2 Further simplification of BPIC . . . . . . . . . . . . . 243
8.4 Derivation of generalized information criterion . . . . . . . . 245
8.4.1 Information theoretic approach . . . . . . . . . . . . . 245
8.4.2 Derivation of GIC . . . . . . . . . . . . . . . . . . . . 248
8.5 Comparison of various Bayesian model selection criteria . . . 250
8.5.1 Utility function . . . . . . . . . . . . . . . . . . . . . . 250
8.5.2 Robustness to the improper prior . . . . . . . . . . . . 252
8.5.3 Computational cost . . . . . . . . . . . . . . . . . . . 252
8.5.4 Estimation methods . . . . . . . . . . . . . . . . . . . 253
8.5.5 Misspecified models . . . . . . . . . . . . . . . . . . . 253
8.5.6 Consistency . . . . . . . . . . . . . . . . . . . . . . . . 253
9 Bayesian model averaging 257
9.1 Definition of Bayesian model averaging . . . . . . . . . . . . 257
9.2 Occam’s window method . . . . . . . . . . . . . . . . . . . . 259
9.3 Bayesian model averaging for linear regression models . . . . 260
9.4 Other model averaging methods . . . . . . . . . . . . . . . . 261
9.4.1 Model averaging with AIC . . . . . . . . . . . . . . . . 262
9.4.2 Model averaging with predictive likelihood . . . . . . . 262
Bibliography 265
Index 285
