OP: Golden-蓉
10416 views | 7 replies

[General statistics question] GARCH model error: flat log likelihood encountered, cannot find uphill direction


OP
Golden-蓉 (student verified) posted on 2018-11-25 12:31:23

My data set contains 4,921 observations of R&D expenditure for listed companies over ten years. The time series have gaps, but every company has at least 7 consecutive years of data. I want to run a separate regression for each company, fit a GARCH model, and estimate the trend in each firm's R&D expenditure. I get the error: flat log likelihood encountered, cannot find uphill direction. My guess is that there may not be enough data, but published papers have obtained results with similar data. Could anyone suggest a way to fix this? Here is my code:
    * load the panel and declare the company-year structure
    use "c:\users\administrator\desktop\7year.dta", clear
    tsset company year

    * number the companies 1..n so we can loop over them
    egen g = group(company)
    sum g
    local n = r(max)

    * f will hold the GARCH residuals for every company
    gen f = .
    forvalues i = 1/`n' {
        * AR(1) mean equation with GARCH(1,1) errors, one company at a time
        arch rd L.rd if g == `i', arch(1) garch(1)
        predict e if e(sample), residuals
        replace f = e if e(sample)
        drop e
    }
    drop g
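One small robustness tweak, not from the original post but a sketch of a common pattern: wrap the arch call in capture so that a single company hitting r(430) does not abort the whole loop, and flag the firms that fail so they can be examined one by one. It assumes the variables rd, g, f and the local `n' defined in the code above.

    * sketch: continue the loop even when one company's GARCH estimation aborts (e.g., r(430))
    gen byte failed = 0
    forvalues i = 1/`n' {
        capture noisily arch rd L.rd if g == `i', arch(1) garch(1)
        if _rc {
            * estimation aborted for this company; flag it and move on
            replace failed = 1 if g == `i'
            continue
        }
        predict e if e(sample), residuals
        replace f = e if e(sample)
        drop e
    }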

Partial output:
Iteration 344: log likelihood = -100.04287
(switching optimization to BHHH)
Iteration 345: log likelihood = -100.0428  (not concave)
Iteration 346: log likelihood = -100.04267  (not concave)
Iteration 347: log likelihood = -100.04267  (not concave)
Iteration 348: log likelihood = -100.04267  (not concave)
Iteration 349: log likelihood = -100.04267  (not concave)
(switching optimization to BFGS)
BFGS stepping has contracted, resetting BFGS Hessian (87)
Iteration 350: log likelihood = -100.04267
Iteration 351: log likelihood = -100.04267  (backed up)
Iteration 352: log likelihood = -100.04266  (backed up)
BFGS stepping has contracted, resetting BFGS Hessian (88)
Iteration 353: log likelihood = -100.0425
Iteration 354: log likelihood = -100.04223  (backed up)
Iteration 355: log likelihood = -100.04223  (backed up)
BFGS stepping has contracted, resetting BFGS Hessian (89)
Iteration 356: log likelihood = -100.04142
Iteration 357: log likelihood = -100.04134  (backed up)
Iteration 358: log likelihood = -100.04133  (backed up)
BFGS stepping has contracted, resetting BFGS Hessian (90)
Iteration 359: log likelihood = -100.04061
(switching optimization to BHHH)
Iteration 360: log likelihood = -100.04058  (not concave)
Iteration 361: log likelihood = -100.04051  (not concave)
Iteration 362: log likelihood = -100.04051  (not concave)
Iteration 363: log likelihood = -100.04051  (not concave)
Iteration 364: log likelihood = -100.04051  (not concave)
(switching optimization to BFGS)
BFGS stepping has contracted, resetting BFGS Hessian (91)
Iteration 365: log likelihood = -100.04051
Iteration 366: log likelihood = -100.04051  (backed up)
Iteration 367: log likelihood = -100.04051  (backed up)
BFGS stepping has contracted, resetting BFGS Hessian (92)
Iteration 368: log likelihood = -100.0403
Iteration 369: log likelihood = -100.04014  (backed up)
Iteration 370: log likelihood = -100.04014  (backed up)
BFGS stepping has contracted, resetting BFGS Hessian (93)
Iteration 371: log likelihood = -100.03937
Iteration 372: log likelihood = -100.03935  (backed up)
Iteration 373: log likelihood = -100.03934  (backed up)
BFGS stepping has contracted, resetting BFGS Hessian (94)
Iteration 374: log likelihood = -100.0388
(switching optimization to BHHH)
Iteration 375: log likelihood = -100.03878  (not concave)
Iteration 376: log likelihood = -100.03869  (not concave)
Iteration 377: log likelihood = -100.03869  (not concave)
Iteration 378: log likelihood = -100.03869  (not concave)
Iteration 379: log likelihood = -100.03869  (not concave)
(switching optimization to BFGS)
BFGS stepping has contracted, resetting BFGS Hessian (95)
Iteration 380: log likelihood = -100.03869
Iteration 381: log likelihood = -100.03869  (backed up)
Iteration 382: log likelihood = -100.03869  (backed up)
Iteration 383: log likelihood = -100.03864
BFGS stepping has contracted, resetting BFGS Hessian (96)
Iteration 384: log likelihood = -100.03859
Iteration 385: log likelihood = -100.03859  (backed up)
Iteration 386: log likelihood = -100.03857  (backed up)
BFGS stepping has contracted, resetting BFGS Hessian (97)
Iteration 387: log likelihood = -100.03836
Iteration 388: log likelihood = -100.03826  (backed up)
Iteration 389: log likelihood = -100.03826  (backed up)
(switching optimization to BHHH)
Iteration 390: log likelihood = -100.03755  (not concave)
Iteration 391: log likelihood = -100.03748  (not concave)
Iteration 392: log likelihood = -100.03748  (not concave)
Iteration 393: log likelihood = -100.03748  (not concave)
Iteration 394: log likelihood = -100.03748  (not concave)
(switching optimization to BFGS)
BFGS stepping has contracted, resetting BFGS Hessian (98)
Iteration 395: log likelihood = -100.03748
Iteration 396: log likelihood = -100.03748  (backed up)
Iteration 397: log likelihood = -100.03746  (backed up)
BFGS stepping has contracted, resetting BFGS Hessian (99)
Iteration 398: log likelihood = -100.0372
Iteration 399: log likelihood = -100.03715  (backed up)
Iteration 400: log likelihood = -100.03715  (backed up)
flat log likelihood encountered, cannot find uphill direction
r(430);






Attachments:
数据.xlsx (22.91 KB) — data
1.do (254 Bytes) — code
Download link: https://bbs.pinggu.org/a-2649392.html

#2
爱吃铜锣烧的诸葛瑾 posted on 2018-11-25 19:05:18
I ran into the same problem and don't know how to handle it either.
From what I read in the Stata documentation, you can use the ml commands to program your own GARCH model and then experiment with the maximization options. But I haven't figured out how to actually do it yet, and it's driving me up the wall.

#3
Golden-蓉 (student verified) posted on 2018-11-25 21:06:49
Reply to 爱吃铜锣烧的诸葛瑾 (2018-11-25 19:05):
I ran into the same problem and don't know how to handle it either.
From what I read in the Stata documentation, you can use the ml commands to program your own ...
OK, thanks, I'll look into it. If it really doesn't work, I may switch to a different method.

#4
努力学习的小姚 posted on 2020-2-21 21:55:02
Reply to Golden-蓉 (2018-11-25 21:06):
OK, thanks, I'll look into it. If it really doesn't work, I may switch to a different method.
Hi, did you ever solve this problem?

#5
努力学习的小姚 posted on 2020-2-21 21:55:38
Reply to 爱吃铜锣烧的诸葛瑾 (2018-11-25 19:05):
I ran into the same problem and don't know how to handle it either.
From what I read in the Stata documentation, you can use the ml commands to program your own ...
Hi, did you ever solve this problem?

#6
571269707@qq.co (student verified) posted on 2020-11-2 16:16:14
1. Getting arch past maximum-likelihood problems
        a. help arch
        b. Look at the maximize options (help maximize).
        c. technique(): for example, arch RDexpenditure, technique(bfgs) arch(1/1) garch(1/1) worked for me.
        d. gtolerance(#): gtolerance(999) can be specified to disable the gradient criterion. If the optimizer gets stuck with repeated "(backed up)" messages, the gradient probably still contains substantial values, but an uphill direction for the likelihood cannot be found. With this option you can usually obtain results, but whether the global maximum of the likelihood has been found is unclear.
        When the maximization is not going well, you can also set the maximum number of iterations (see [R] maximize) to the point where the optimizer appears to be stuck and inspect the estimates at that point.
        e. from(init_specs) specifies initial values for the coefficients. ARCH models can be sensitive to initial values and may have coefficient values that correspond to local maxima. The default starting values are obtained from a series of regressions; by asymptotic theory they are consistent for b and the ARMA parameters and usually reasonable for the rest. These values may not always be feasible, however, in that the likelihood cannot be evaluated at the initial values arch first chooses; in that case estimation is restarted with the ARCH and ARMA parameters initialized to zero. It is possible, though unlikely, that even these values are infeasible and you will have to supply initial values yourself.
(A sketch applying these options to the original loop follows below.)
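A minimal sketch of how these maximize options might be combined in the original per-company loop. This is only an illustration, not what either poster actually ran; rd, g, and the local `n' come from the first post's code, and the values passed to gtolerance() and iterate() are arbitrary starting points to experiment with (see help arch and help maximize).

    * sketch: relax the maximization settings inside the per-company loop
    forvalues i = 1/`n' {
        arch rd L.rd if g == `i', arch(1) garch(1) ///
            technique(bfgs) gtolerance(999) iterate(200)
        * if the default starting values are infeasible for some firm, from()
        * can be used to supply initial coefficient values (see help arch)
    }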

#7
571269707@qq.co (student verified) posted on 2020-11-2 16:17:58
Reply to 571269707@qq.co (2020-11-2 16:16):
1. Getting arch past maximum-likelihood problems
        a. help arch
        b. Look at the maximize options ...
I changed the technique and gtolerance settings, and I also changed the number of iterations.

#8
Meycon posted on 2021-3-30 08:53:24
Reply to 571269707@qq.co (2020-11-2 16:17):
I changed the technique and gtolerance settings, and I also changed the number of iterations.
Could you tell me exactly how you changed them?
