Thread starter: Lisrelchen

[Case Study] Bayesian GLMs made easy with PyMC3


OP
Lisrelchen posted on 2016-12-18 07:11:22


Author: Thomas Wiecki



In this blog post I will talk about:

  • How the Bayesian Revolution in many scientific disciplines is hindered by poor usability of current Probabilistic Programming languages.
  • A gentle introduction to Bayesian linear regression and how it differs from the frequentist approach.
  • A preview of PyMC3 (currently in alpha) and its new GLM submodule, which I wrote to make creating and estimating Bayesian GLMs as easy as fitting frequentist GLMs in R (a sketch of the formula interface follows this list).
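As a taste of that GLM submodule, here is a minimal sketch of the formula interface. The exact entry point has changed across versions (the alpha described in the post exposed a glm() helper; later PyMC3 3.x releases use GLM.from_formula), so the call below assumes a later release and is not a verbatim quote from the post.

import numpy as np
import pandas as pd
import pymc3 as pm

# Toy data: y = 1 + 2*x + Gaussian noise
x = np.linspace(0, 1, 200)
y = 1 + 2 * x + np.random.normal(scale=0.5, size=200)
data = pd.DataFrame(dict(x=x, y=y))

with pm.Model() as model:
    # R-style formula; default priors for intercept, slope and noise are set up automatically
    pm.GLM.from_formula('y ~ x', data)
    trace = pm.sample(2000)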

Attachment: Bayesian GLMs made easy with PyMC3.pdf (1.01 MB)






Reply #1
Lisrelchen posted on 2016-12-18 07:12:00
import numpy as np

# True parameters of the simulated data
size = 200
true_intercept = 1
true_slope = 2

x = np.linspace(0, 1, size)
# y = a + b*x
true_regression_line = true_intercept + true_slope * x
# add Gaussian noise
y = true_regression_line + np.random.normal(scale=.5, size=size)

data = dict(x=x, y=y)

Reply #2
Lisrelchen posted on 2016-12-18 07:12:17
import matplotlib.pyplot as plt

# Plot the noisy samples against the true regression line
fig = plt.figure(figsize=(7, 7))
ax = fig.add_subplot(111, xlabel='x', ylabel='y',
                     title='Generated data and underlying model')
ax.plot(x, y, 'x', label='sampled data')
ax.plot(x, true_regression_line, label='true regression line', lw=2.)
plt.legend(loc=0);

Reply #3
Lisrelchen posted on 2016-12-18 07:12:59
Here we use the awesome new NUTS sampler (our Inference Button) to draw 2000 posterior samples.

from pymc3 import Model, Normal, HalfCauchy, NUTS, find_MAP, sample

with Model() as model:  # model specifications in PyMC3 are wrapped in a with-statement
    # Define priors
    sigma = HalfCauchy('sigma', beta=10, testval=1.)
    intercept = Normal('Intercept', 0, sd=20)
    x_coeff = Normal('x', 0, sd=20)

    # Define likelihood
    likelihood = Normal('y', mu=intercept + x_coeff * x,
                        sd=sigma, observed=y)

    # Inference!
    start = find_MAP()  # Find starting value by optimization
    step = NUTS(scaling=start)  # Instantiate MCMC sampling algorithm
    trace = sample(2000, step, start=start, progressbar=False)  # draw 2000 posterior samples using NUTS
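After sampling, it is worth glancing at the chains before interpreting anything. This is a minimal sketch using PyMC3's traceplot utility; the inspection step and the choice to discard the first 100 draws as burn-in are additions, not part of the excerpt above.

from pymc3 import traceplot

# Show marginal posteriors and sampling traces for Intercept, x and sigma,
# skipping the first 100 draws as burn-in
traceplot(trace[100:]);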
