
[A Controversial Bayesian Monograph] The Bayesian Core [Promotion Reward]


#1 (OP)
东西方咨询 posted on 2014-6-22 10:37:20


Bayesian Core: A Practical Approach to Computational Bayesian Statistics

Jean-Michel Marin (Author), Christian Robert (Author)

This Bayesian modeling book is intended for practitioners and applied statisticians looking for a self-contained entry to computational Bayesian statistics. Focusing on standard statistical models and backed up by discussed real datasets available from the book website, it provides an operational methodology for conducting Bayesian inference, rather than focusing on its theoretical justifications. Special attention is paid to the derivation of prior distributions in each case and specific reference solutions are given for each of the models. Similarly, computational details are worked out to lead the reader towards an effective programming of the methods given in the book.
Product Details
  • Hardcover: 258 pages
  • Publisher: Springer; 1st ed. 2007. Corr. 2nd printing 2007 edition (Feb. 18 2007)
  • Language: English
  • ISBN-10: 0387389792
  • ISBN-13: 978-0387389790
  • Product Dimensions: 23.6 x 15.5 x 2.3 cm
  • Shipping Weight: 295 g
Product Description

Review:
"The matching of each computational technique to a real data set allows readers to fully appreciate the Bayesian analysis process, from model formation to prior selection and practical implementation." (Lawrence Joseph from Biometrics, Issue 63, September 2007)


"Recent times have seen several new books introducing Bayesian computing. This book is an introduction on a higher level. ‘The purpose of this book is to provide a self-contained entry to practical & computational Bayesian Statistics using generic examples from the most common models.’ … Many researchers and Ph.D. students will find the R-programs in the book a nice start for their own problems and an innovative source for further developments." (Wolfgang Polasek, Statistical Papers, Vol. 49, 2008)

"This text intentionally focuses on a few fundamental Bayesian statistical models and key computational tools. … Bayesian Core is more than a textbook: it is an entire course carefully crafted with the student in mind. … As an instructor of Bayesian statistics courses, I was pleased to discover this ready- and well-made, self-contained introductory course for (primarily) graduate students in statistics and other quantitative disciplines. I am seriously considering Bayesian Core for my next course in Bayesian statistics." (Jarrett J. Barber, Journal of the American Statistical Association, Vol. 103 (481), 2008)


"The book aims to be a self-contained entry to Bayesian computational statistics for practitioners as well as students at both the graduate and undergraduate level, and has been test-driven in a number of courses given by the authors. … Two particularly attractive aspects of the book are its concise and clear writing style, which is really enjoyable, and its focus on the development of an intuitive feel for the material: the numerous insightful remarks should make the book a real treat … ." (Pieter Bastiaan Ober, Journal of Applied Statistics, Vol. 35 (1), 2008)
"The book is a good, compact and self-contained introduction to the applications of Bayesian statistics and to the use of R to implement the procedures. … a reader with a previous formal course in statistics will enjoy reading this book. … the authors are not shy of presenting such complex models as hidden Markov models and Markov random fields in a simple and direct way. This adds an edge to a compact and useful text." (Mauro Gasparini, Zentralblatt MATH, Vol. 1137 (15), 2008)


"This book’s title captures its focus. It is a textbook covering the core statistical models from both a Bayesian viewpoint and a computational viewpoint. … There is a discussion of choice of priors, along with math to derive the priors. … The book is being actively used as a textbook by a number of university courses. … The course level is graduate or advanced undergraduate. Solutions to the exercises are available to course instructors … . In conclusion, the book does what it does, well." (Rohan Baxter, ACM Computing Reviews, December, 2008)


From the Back Cover
This Bayesian modeling book is intended for practitioners and applied statisticians looking for a self-contained entry to computational Bayesian statistics. Focusing on standard statistical models and backed up by discussed real datasets available from the book website, it provides an operational methodology for conducting Bayesian inference, rather than focusing on its theoretical justifications. Special attention is paid to the derivation of prior distributions in each case and specific reference solutions are given for each of the models. Similarly, computational details are worked out to lead the reader towards an effective programming of the methods given in the book. While R programs are provided on the book website and R hints are given in the computational sections of the book, The Bayesian Core requires no knowledge of the R language and it can be read and used with any other programming language.
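
For readers who have never touched R, the following is a minimal, self-contained illustration (an editorial sketch, not taken from the book or its website) of the kind of conjugate Bayesian update the text works through, here for a normal mean with known variance:

# Editorial illustration only (not from the book): conjugate update for a normal mean.
# Model: x_i ~ N(mu, sigma^2) with sigma known; prior: mu ~ N(mu0, tau0^2).
set.seed(1)
x=rnorm(50,mean=2,sd=1)            # simulated data; sigma assumed known and equal to 1
mu0=0; tau0=10; sigma=1; n=length(x)
post.prec=1/tau0^2+n/sigma^2        # posterior precision
post.mean=(mu0/tau0^2+n*mean(x)/sigma^2)/post.prec
post.sd=sqrt(1/post.prec)
c(post.mean=post.mean,post.sd=post.sd)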

The Bayesian Core can be used as a textbook at both undergraduate and graduate levels, as exemplified by courses given at Université Paris Dauphine (France), University of Canterbury (New Zealand), and University of British Columbia (Canada). It serves as a unique textbook for a service course for scientists aiming at analyzing data the Bayesian way as well as an introductory course on Bayesian statistics. The prerequisites for the book are a basic knowledge of probability theory and of statistics. Methodological and data-based exercises are included within the main text and students are expected to solve them as they read the book. Those exercises can obviously serve as assignments, as was done in the above courses. Datasets, R codes and course slides all are available on the book website.

Jean-Michel Marin is currently senior researcher at INRIA, the French Computer Science research institute, and located at Université Paris-Sud, Orsay. He has previously been Assistant Professor at Université Paris Dauphine for four years. He has written numerous papers on Bayesian methodology and computing, and is currently a member of the council of the French Statistical Society.

Christian Robert is Professor of Statistics at Université Paris Dauphine and Head of the Statistics Research Laboratory at CREST-INSEE, Paris. He has written over a hundred papers on Bayesian Statistics and computational methods and is the author or co-author of seven books on those topics, including The Bayesian Choice (Springer, 2001), winner of the ISBA DeGroot Prize in 2004. He is a Fellow and member of the council of the Institute of Mathematical Statistics, and a Fellow and member of the research committee of the Royal Statistical Society. He is currently co-editor of the Journal of the Royal Statistical Society, Series B, after taking part in the editorial boards of the Journal of the American Statistical Association, the Annals of Statistics, Statistical Science, and Bayesian Analysis. He is also the winner of the Young Statistician prize of the Paris Statistical Society in 1996 and a recipient of an Erskine Fellowship from the University of Canterbury (NZ) in 2006.


Hidden content in this post

The Bayesian Core.rar (7.68 MB, requires: 10 forum coins). This attachment includes:
  • The Bayesian Core.pdf


https://www.ceremade.dauphine.fr/~xian/BCS/



#2
农村固定观察点 posted on 2014-6-22 11:28:21
# CHAPTER 2: R COMMANDS
# 19/12/2006

################################################################################

# normaldata DATASET IMPLEMENTATION AND GRAPHS

normaldata=scan("normaldata")
hist(normaldata,nclass=20,col="blue")

################################################################################

# CMBdata DATASET IMPLEMENTATION AND GRAPHS

CMBdata=scan("CMBdata")
CMBdatamat=matrix(CMBdata,800,800,byrow=TRUE)
image(CMBdatamat,col=grey(1:1000/1000))
hist(CMBdata,nclass=100,col="red",prob=TRUE)
f=function(x){dnorm(x,mean(CMBdata),sd(CMBdata))}
curve(f,-0.2,0.8,add=TRUE,col="blue")

################################################################################

# normaldata DATASET COMPUTATION OF EACH PREDICTIVE CDF

n=length(normaldata)
outl=rep(0,n)
outf=outl
for (i in 1:n)
{
outl[i]=pt((normaldata[i]-mean(normaldata[normaldata[]!=normaldata[i]]))/
(sd(normaldata[normaldata[]!=normaldata[i]])*sqrt(91/90)),90)
outf[i]=dt((normaldata[i]-mean(normaldata[normaldata[]!=normaldata[i]]))/
(sd(normaldata[normaldata[]!=normaldata[i]])*sqrt(91/90)),90)/
(sd(normaldata[normaldata[]!=normaldata[i]])*sqrt(91/90))
}

plot(c(0,1),c(0,1),lwd=2,ylab="Predictive",xlab="Uniform",type="l")
points(seq(1/91,90/91,length=90),sort(outl),pch=19,col="steelblue3")
points(seq(1/91,90/91,length=90),sort(runif(90)),pch=21,col="tomato")

################################################################################
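
For reuse, the leave-one-out predictive computation in the loop above can be wrapped in a small helper. This is only an editorial sketch, not part of the book's scripts; it keeps the same Student-t distribution (90 degrees of freedom) and the same sd(rest)*sqrt(91/90) scaling as the commands above, without re-deriving them, and assumes normaldata has already been loaded with scan():

# Editorial sketch (not from the book): the leave-one-out predictive loop above as a helper.
loo.predictive=function(x,i)
{
  rest=x[x!=x[i]]                       # drop the i-th value, exactly as in the original loop
  scale=sd(rest)*sqrt(91/90)
  z=(x[i]-mean(rest))/scale
  c(cdf=pt(z,90),density=dt(z,90)/scale)
}
out=sapply(1:length(normaldata),function(i) loo.predictive(normaldata,i))
outl=out["cdf",]
outf=out["density",]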

#3
Lisrelchen posted on 2014-6-22 12:10:23
# CHAPTER 3: R COMMANDS
# 19/12/2006

################################################################################

# caterpillar DATASET IMPLEMENTATION

source("#3.R")   # helpers used below (inv, invt1, newt1, lpostw, lpostwnoinf, gibbsNIP2, gibbsIP) are presumably defined in this file from the book website
processio=read.table("caterpillar")
y=log(processio$V11)
X=as.matrix(cbind(rep(1,33),processio[,1:10]))
n=length(y)
k=dim(X)[2]

################################################################################

# caterpillar DATASET: ORDINARY LEAST SQUARES ESTIMATION OF beta

betahat=inv(t(X)%*%X,tol=10e-20)%*%t(X)%*%y
betahat

################################################################################

# caterpillar DATASET: UNBIASED ESTIMATION OF sigma2

S2=t(y-X%*%betahat)%*%(y-X%*%betahat)
sigma2hat=S2/(n-k)
sigma2hat

################################################################################

# caterpillar DATASET: ESTIMATION OF THE VARIANCE OF betahat

diag(as.numeric(sigma2hat)*inv(t(X)%*%X))   # as.numeric() replaces the defunct as.real() of the original script

################################################################################

# CONJUGATE PRIOR ANALYSIS

a=2.1
b=2

# PRIOR MEAN OF sigma2

2/(2.1-1)

# PRIOR VARIANCE OF sigma2

2^2/((2.1-1)^2*(2.1-2))

# caterpillar DATASET: POSTERIOR MEANS OF sigma2 FOR DIFFERENT c

# c=0.1

M=10*diag(11)
T=inv(inv(M)+inv(t(X)%*%X))
(2*b+S2+t(betahat)%*%T%*%betahat)/(n+2*a-2)

# c=1

M=diag(11)
T=inv(inv(M)+inv(t(X)%*%X))
(2*b+S2+t(betahat)%*%T%*%betahat)/(n+2*a-2)

# c=10

M=0.1*diag(11)
T=inv(inv(M)+inv(t(X)%*%X))
(2*b+S2+t(betahat)%*%T%*%betahat)/(n+2*a-2)

# c=100

M=0.01*diag(11)
T=inv(inv(M)+inv(t(X)%*%X))
(2*b+S2+t(betahat)%*%T%*%betahat)/(n+2*a-2)

# c=1000

M=0.001*diag(11)
T=inv(inv(M)+inv(t(X)%*%X))
(2*b+S2+t(betahat)%*%T%*%betahat)/(n+2*a-2)

# caterpillar DATASET: POSTERIOR MEANS OF beta FOR DIFFERENT c

# c=0.1

M=10*diag(11)
b1beta=inv(M+t(X)%*%X)%*%t(X)%*%y
b1beta[1]

# c=1

M=diag(11)
b2beta=inv(M+t(X)%*%X)%*%t(X)%*%y
b2beta[1]

# c=10

M=0.1*diag(11)
b3beta=inv(M+t(X)%*%X)%*%t(X)%*%y
b3beta[1]

# c=100

M=0.01*diag(11)
b4beta=inv(M+t(X)%*%X)%*%t(X)%*%y
b4beta[1]

# c=1000

M=0.001*diag(11)
b5beta=inv(M+t(X)%*%X)%*%t(X)%*%y
b5beta[1]

# caterpillar DATASET: POSTERIOR VARIANCES OF THE FIRST COMPONENT OF beta
# FOR DIFFERENT c

# c=0.1

M=10*diag(11)
T=inv(inv(M)+inv(t(X)%*%X))
E=as.numeric(2*b+S2+t(betahat)%*%T%*%betahat)/(n+2*a-2)*inv(M+t(X)%*%X)
E[1,1]

# c=1

M=diag(11)
T=inv(inv(M)+inv(t(X)%*%X))
E=as.numeric(2*b+S2+t(betahat)%*%T%*%betahat)/(n+2*a-2)*inv(M+t(X)%*%X)
E[1,1]

# c=10

M=0.1*diag(11)
T=inv(inv(M)+inv(t(X)%*%X))
E=as.numeric(2*b+S2+t(betahat)%*%T%*%betahat)/(n+2*a-2)*inv(M+t(X)%*%X)
E[1,1]

# c=100

M=0.01*diag(11)
T=inv(inv(M)+inv(t(X)%*%X))
E=as.numeric(2*b+S2+t(betahat)%*%T%*%betahat)/(n+2*a-2)*inv(M+t(X)%*%X)
E[1,1]

# c=1000

M=0.001*diag(11)
T=inv(inv(M)+inv(t(X)%*%X))
E=as.numeric(2*b+S2+t(betahat)%*%T%*%betahat)/(n+2*a-2)*inv(M+t(X)%*%X)
E[1,1]

################################################################################

# NON-INFORMATIVE PRIOR ANALYSIS

# caterpillar DATASET: 0.95 HPD LOWER BOUND OF beta

betahat-qt(0.95,22)*sqrt(diag(as.numeric(sigma2hat)*inv(t(X)%*%X)))

# caterpillar DATASET: 0.95 HPD UPPER BOUND OF beta

betahat+qt(0.95,22)*sqrt(diag(as.numeric(sigma2hat)*inv(t(X)%*%X)))

################################################################################

# ZELLNER INFORMATIVE G-PRIOR ANALYSIS

# caterpillar DATASET: POSTERIOR MEAN OF beta FOR c=100

100/101*betahat

# caterpillar DATASET: POSTERIOR VARIANCE OF beta FOR c=100

diag(100/(33*101)*as.numeric(S2+t(betahat)%*%t(X)%*%X%*%betahat/101)*inv(t(X)%*%X))

# BAYES FACTOR

X0=X[,-c(8,9)]
P0=X0%*%inv(t(X0)%*%X0)%*%t(X0)
lulu0=101^(-9/2)*(t(y)%*%y-100/101*t(y)%*%P0%*%y)^(-33/2)

P=X%*%inv(t(X)%*%X)%*%t(X)
lulu=101^(-11/2)*(t(y)%*%y-100/101*t(y)%*%P%*%y)^(-33/2)

log10(lulu0/lulu)

################################################################################

# caterpillar DATASET: ZELLNER NON-INFORMATIVE G-PRIOR ANALYSIS

cc=1:100000
# The quadratic forms below are 1x1 matrices in the original script; they are coerced
# to scalars here so that they combine elementwise with the vector cc.
yty=as.numeric(t(y)%*%y)
yPy=as.numeric(t(y)%*%P%*%y)
bXXb=as.numeric(t(betahat)%*%t(X)%*%X%*%betahat)
sum(cc/(cc+1)*betahat[1]*cc^(-1)*(cc+1)^(-11/2)*(yty-cc/(cc+1)*yPy)^(-33/2))/sum(cc^(-1)*(cc+1)^(-11/2)*(yty-cc/(cc+1)*yPy)^(-33/2))
sum(cc/(cc+1)*betahat[2]*cc^(-1)*(cc+1)^(-11/2)*(yty-cc/(cc+1)*yPy)^(-33/2))/sum(cc^(-1)*(cc+1)^(-11/2)*(yty-cc/(cc+1)*yPy)^(-33/2))
sum(cc/(cc+1)*betahat[3]*cc^(-1)*(cc+1)^(-11/2)*(yty-cc/(cc+1)*yPy)^(-33/2))/sum(cc^(-1)*(cc+1)^(-11/2)*(yty-cc/(cc+1)*yPy)^(-33/2))
sum(cc/(cc+1)*betahat[4]*cc^(-1)*(cc+1)^(-11/2)*(yty-cc/(cc+1)*yPy)^(-33/2))/sum(cc^(-1)*(cc+1)^(-11/2)*(yty-cc/(cc+1)*yPy)^(-33/2))
sum(cc/(cc+1)*betahat[5]*cc^(-1)*(cc+1)^(-11/2)*(yty-cc/(cc+1)*yPy)^(-33/2))/sum(cc^(-1)*(cc+1)^(-11/2)*(yty-cc/(cc+1)*yPy)^(-33/2))
sum(cc/(cc+1)*betahat[6]*cc^(-1)*(cc+1)^(-11/2)*(yty-cc/(cc+1)*yPy)^(-33/2))/sum(cc^(-1)*(cc+1)^(-11/2)*(yty-cc/(cc+1)*yPy)^(-33/2))
sum(cc/(cc+1)*betahat[7]*cc^(-1)*(cc+1)^(-11/2)*(yty-cc/(cc+1)*yPy)^(-33/2))/sum(cc^(-1)*(cc+1)^(-11/2)*(yty-cc/(cc+1)*yPy)^(-33/2))
sum(cc/(cc+1)*betahat[8]*cc^(-1)*(cc+1)^(-11/2)*(yty-cc/(cc+1)*yPy)^(-33/2))/sum(cc^(-1)*(cc+1)^(-11/2)*(yty-cc/(cc+1)*yPy)^(-33/2))
sum(cc/(cc+1)*betahat[9]*cc^(-1)*(cc+1)^(-11/2)*(yty-cc/(cc+1)*yPy)^(-33/2))/sum(cc^(-1)*(cc+1)^(-11/2)*(yty-cc/(cc+1)*yPy)^(-33/2))
sum(cc/(cc+1)*betahat[10]*cc^(-1)*(cc+1)^(-11/2)*(yty-cc/(cc+1)*yPy)^(-33/2))/sum(cc^(-1)*(cc+1)^(-11/2)*(yty-cc/(cc+1)*yPy)^(-33/2))
sum(cc/(cc+1)*betahat[11]*cc^(-1)*(cc+1)^(-11/2)*(yty-cc/(cc+1)*yPy)^(-33/2))/sum(cc^(-1)*(cc+1)^(-11/2)*(yty-cc/(cc+1)*yPy)^(-33/2))

sum((as.numeric(S2)+bXXb/(cc+1))/(n-2)*cc^(-1)*(cc+1)^(-11/2)*(yty-cc/(cc+1)*yPy)^(-33/2))/sum(cc^(-1)*(cc+1)^(-11/2)*(yty-cc/(cc+1)*yPy)^(-33/2))

################################################################################

# VARIABLE SELECTION

library(combinat)
GAMPRO=matrix(0,1024,10)
qq=cumsum(c(choose(10,0:10)))
for (k in 1:10)
{
tab=t(combn(10,k))
GAMPRO[(qq[k]+1):qq[k+1],]=t(apply(tab,1,invt1,p=10))
}
rm(qq,tab)

# caterpillar DATASET: CALCULATION OF THE MODEL POSTERIOR PROBABILITIES
# UNDER ZELLNER NON-INFORMATIVE G-PRIOR

yp=y
Xp=as.matrix(X[,2:11])

PRO1NIP=rep(0,1024)
for (k in 1:1024) {PRO1NIP[k]=lpostwnoinf(GAMPRO[k,],yp,Xp); print(k)}
PRO1NIP=exp(PRO1NIP-max(PRO1NIP))
PRO1NIP=PRO1NIP/sum(PRO1NIP)
MONIP=order(PRO1NIP)[1024:1005]
TOPPRO1NIP=cbind(apply(GAMPRO[MONIP,],1,newt1),PRO1NIP[MONIP])
rm(MONIP)

# caterpillar DATASET: GIBBS SAMPLING UNDER ZELLNER NON-INFORMATIVE G-PRIOR

GIBBSPRO1NIP=gibbsNIP2(20000,yp,Xp,PRO1NIP)
RGIBBSPRO1NIP=apply(GIBBSPRO1NIP[10001:20000,],1,newt1)
RGIBBSPRO1NIP=summary(as.factor(RGIBBSPRO1NIP))/10000

# caterpillar DATASET: CALCULATION OF THE MODEL POSTERIOR PROBABILITIES
# UNDER ZELLNER INFORMATIVE G-PRIOR

PRO1IP2=rep(0,1024)
for (k in 1:1024) {PRO1IP2[k]=lpostw(GAMPRO[k,],yp,Xp,rep(0,11),100); print(k)}
PRO1IP2=exp(PRO1IP2-max(PRO1IP2))
PRO1IP2=PRO1IP2/sum(PRO1IP2)
MOIP2=order(PRO1IP2)[1024:1005]
TOPPRO1IP2=cbind(apply(GAMPRO[MOIP2,],1,newt1),PRO1IP2[MOIP2])
rm(MOIP2)

# caterpillar DATASET: GIBBS SAMPLING UNDER ZELLNER INFORMATIVE G-PRIOR

GIBBSPRO1IP2=gibbsIP(20000,yp,Xp,rep(0,11),100)
RGIBBSPRO1IP2=apply(GIBBSPRO1IP2[10001:20000,],1,newt1)
RGIBBSPRO1IP2=summary(as.factor(RGIBBSPRO1IP2))/10000

################################################################################
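
The informative g-prior commands above (posterior mean 100/101*betahat and the matching posterior variance for c=100) can also be parameterised in c. The following editorial sketch only generalises those two expressions by replacing 100/101 with c/(c+1) and the sample size 33 with n; it assumes betahat, S2, X, y, n and the inv() helper from #3.R are already in the workspace:

# Editorial sketch: Zellner informative g-prior summaries as a function of c,
# obtained by generalising the c=100 expressions above (100/101 -> c/(c+1), 33 -> n).
gprior.summary=function(c0)
{
  post.mean=c0/(c0+1)*betahat
  post.var=diag(c0/(n*(c0+1))*as.numeric(S2+t(betahat)%*%t(X)%*%X%*%betahat/(c0+1))*inv(t(X)%*%X))
  list(mean=post.mean,var=post.var)
}
gprior.summary(100)    # should reproduce the c=100 results above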

#4
0jzhang posted on 2014-6-22 12:47:02
Bayesian Core: A Practical Approach to Computational Bayesian Statistics

#5
shede6688 posted on 2014-6-22 15:15:45
Downloading it to have a look.

#6
songlinjl posted on 2014-6-22 17:45:47
What is the controversy about? I'll grab it and take a look.

#7
wenhai66 posted on 2014-6-22 18:08:50
A controversial Bayesian monograph?

#8
再把相思寄巫山 posted on 2014-6-24 16:56:20
thanks for sharing

#9
jack_dull posted on 2014-6-28 11:15:38
Thanks for providing such a good book.

#10
xibei2008 posted on 2014-7-3 15:28:36

