OP: 秋日私语

[Stata Advanced Class] A question about setting the ltolerance option?


#1
秋日私语 posted on 2011-7-10 22:28:51

Professor Lian, when I estimate an ARCH model with the command arch x y, earch(1) egarch(1) ltolerance(0.0001), the program keeps running even after the change in the log-likelihood value between iterations falls below my setting of 0.0001. Why is that? Also, I find that the arch command runs very slowly. Is there any way to speed it up?
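On the speed question, one common tactic is to fit a simpler model first and pass its estimates as starting values through the from() maximize option, so the harder model starts closer to the optimum. This is only a sketch under the assumption that x and y are as in the command above; from(..., skip) matches parameters by name and ignores any that do not appear in the second model:

```stata
* Fit a quick, simple ARCH(1) model first and save its coefficients
arch x y, arch(1)
matrix b0 = e(b)

* Use those estimates as starting values for the slower
* EARCH/EGARCH specification; skip ignores parameters in b0
* that the new model does not have (e.g. the arch(1) term)
arch x y, earch(1) egarch(1) ltolerance(0.0001) from(b0, skip)
```

Good starting values mainly shorten the iteration path; they do not change the model being estimated.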

Stay calm; seek peace of mind.

#2
arlionn (employment verified) posted on 2011-7-11 08:54:48
That should not happen. I have used this option before when estimating models by MLE and never had a problem.
Please post a screenshot of your output so I can take a look.

#3
秋日私语 posted on 2011-7-11 13:12:45
arch x y , earch(1) egarch(1) het(z) ltolerance(0.0001)

Number of gaps in sample:  8  
(note: conditioning reset at each gap)


(setting optimization to BHHH)
Iteration 0:   log likelihood =  586.42721  
Iteration 1:   log likelihood =  597.54868  
Iteration 2:   log likelihood =  603.71421  
Iteration 3:   log likelihood =  607.30761  
Iteration 4:   log likelihood =   609.5603  
(switching optimization to BFGS)
Iteration 5:   log likelihood =   610.3975  
Iteration 6:   log likelihood =  610.52171  
Iteration 7:   log likelihood =  610.81827  
Iteration 8:   log likelihood =  610.89797  
Iteration 9:   log likelihood =  610.90555  
Iteration 10:  log likelihood =  610.90643  
Iteration 11:  log likelihood =  610.90911  
Iteration 12:  log likelihood =  610.90912  *** it should have stopped here

Iteration 13:  log likelihood =  610.90922  
Iteration 14:  log likelihood =  610.90924   
(switching optimization to BHHH)
Iteration 15:  log likelihood =  610.90924  
Iteration 16:  log likelihood =  610.90925  (backed up)   
Iteration 17:  log likelihood =  610.90925  (backed up)
Iteration 18:  log likelihood =  610.90925  (backed up)
Iteration 19:  log likelihood =  610.90925  (backed up)
(switching optimization to BFGS)
Iteration 20:  log likelihood =  610.90925  (backed up)
.................

#4
秋日私语 posted on 2011-7-11 14:27:15
(setting optimization to BHHH)
Iteration 0:   log likelihood =  439.03055  
Iteration 1:   log likelihood =  450.28815  
Iteration 2:   log likelihood =  465.40009  
Iteration 3:   log likelihood =  467.02226  
Iteration 4:   log likelihood =  467.16146  
(switching optimization to BFGS)
Iteration 5:   log likelihood =  467.67814  
Iteration 6:   log likelihood =  467.82723  
Iteration 7:   log likelihood =   467.8609  
Iteration 8:   log likelihood =  467.89386  
Iteration 9:   log likelihood =  467.89716  
Iteration 10:  log likelihood =  467.89747  
Iteration 11:  log likelihood =  467.89962  
Iteration 12:  log likelihood =  467.89999  
Iteration 13:  log likelihood =  467.90005  
Iteration 14:  log likelihood =  467.90007  (backed up)
(switching optimization to BHHH)
Iteration 15:  log likelihood =   467.9001  
Iteration 16:  log likelihood =  467.90011  (backed up)
Iteration 17:  log likelihood =  467.90011  (backed up)
Iteration 18:  log likelihood =  467.90011  (backed up)
Iteration 19:  log likelihood =  467.90011  (backed up)
(switching optimization to BFGS)
Iteration 20:  log likelihood =  467.90011  (backed up)
Iteration 21:  log likelihood =  467.90011  (backed up)
Iteration 22:  log likelihood =  467.90011  (backed up)
Iteration 23:  log likelihood =  467.90011  (backed up)
Iteration 24:  log likelihood =  467.90011  (backed up)
Iteration 25:  log likelihood =  467.90011  (backed up)
Iteration 26:  log likelihood =  467.90011  (backed up)
Iteration 27:  log likelihood =  467.90011  (backed up)
Iteration 28:  log likelihood =  467.90011  (backed up)
Iteration 29:  log likelihood =  467.90011  (backed up)
(switching optimization to BHHH)
Iteration 30:  log likelihood =  467.90011  (backed up)
Iteration 31:  log likelihood =  467.90011  (backed up)
Iteration 32:  log likelihood =  467.90011  (backed up)
Iteration 33:  log likelihood =  467.90011  (backed up)
Iteration 34:  log likelihood =  467.90011  (backed up)
(switching optimization to BFGS)
Iteration 35:  log likelihood =  467.90011  (backed up)
Iteration 36:  log likelihood =  467.90011  (backed up)
Iteration 37:  log likelihood =  467.90011  (backed up)
Iteration 38:  log likelihood =  467.90011  (backed up)

#5
秋日私语 posted on 2011-7-11 15:04:19
I checked the help file. It may be what the documentation describes: "If the optimizer becomes stuck with repeated '(backed up)' messages, the gradient probably still contains substantial values, but an uphill direction cannot be found for the likelihood."

So I set the gtolerance(999) option, but then the result may not be a global optimum. Alternatively, I capped the number of iterations, e.g. with iterate(50), but that result differs substantially from the gtolerance(999) result. What should I do in this situation?
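To see concretely how far apart the two stopping rules land, the two fits can be stored and compared side by side. A sketch, with the variable names taken from the commands above:

```stata
* Fit under each stopping rule and store the results
arch x y, earch(1) egarch(1) het(z) ltolerance(0.0001) gtolerance(999)
estimates store g999

arch x y, earch(1) egarch(1) het(z) ltolerance(0.0001) iterate(50)
estimates store it50

* Compare coefficients and log likelihoods in one table
estimates table g999 it50, stats(ll) b(%10.5f) se(%10.5f)
```

Large coefficient differences at similar log likelihoods would suggest a flat or ill-behaved likelihood surface rather than a well-identified optimum.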

#6
arlionn (employment verified) posted on 2011-7-12 08:36:25
Quoting 秋日私语 (2011-7-11 13:12):
> arch x y , earch(1) egarch(1) het(z) ltolerance(0.0001)
> ...
> Iteration 11:  log likelihood =  610.90911
> Iteration 12:  log likelihood =  610.90912  *** it should have stopped here
A: On the surface the change looks like 0.00001, but the displayed values are rounded, so the actual change could be larger, e.g. 0.000013.
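The rounding point is easy to check: the iteration log prints the log likelihood with limited precision, so two underlying values can differ by more than the displayed digits suggest. A small illustration with hypothetical unrounded values:

```stata
* Two hypothetical unrounded log-likelihood values
scalar ll11 = 610.909110
scalar ll12 = 610.909123

* Printed at the log's precision they appear to differ by 0.00001 ...
display %9.5f ll11
display %9.5f ll12

* ... but the actual change is larger than the display suggests
display ll12 - ll11
```

So the printed log alone cannot tell you whether the convergence criterion was met on a given iteration.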


#7
秋日私语 posted on 2011-7-12 11:04:34
I set 0.0001, not 0.00001.
Also, as above: I set the gtolerance(999) option, but the result may not be a global optimum; alternatively I capped the iterations with iterate(50), but that result differs substantially from the gtolerance(999) result. What should I do in this situation?

#8
arlionn (employment verified) posted on 2011-7-13 07:55:10
That can only be a last resort, because it cannot guarantee a global optimum.
Possible approaches:
1. Go back to the source and check whether the model specification is the problem.
Pick a subsample, say a particular time window, and test whether the model converges there.
Fit the simplest GARCH(1,1) model first; if it converges, move step by step to more complex specifications.
2. Check whether the data themselves are the problem.
For example, whether they contain severe outliers or mis-entered observations.
3. If none of the above reveals a problem, consider a slightly smaller convergence criterion, such as 0.0001.
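The stepwise strategy in points 1 and 2 above can be sketched as follows. This assumes the data are tsset with a monthly time variable t; the date range is purely illustrative:

```stata
* Step 1: test convergence on a subsample, e.g. one time window
* (adjust the tin() range to your own data)
arch x y if tin(2005m1, 2008m12), arch(1) garch(1)

* Step 2: if the simple GARCH(1,1) converges on the full sample,
* add complexity one piece at a time
arch x y, arch(1) garch(1)
arch x y, earch(1) egarch(1)
arch x y, earch(1) egarch(1) het(z)
```

The specification where convergence first breaks down points to which part of the model (or of the data) is causing the trouble.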

#9
秋日私语 posted on 2011-7-14 07:26:14
Thanks, I'll give it a try.
