
[Computer Science] Dimension Reduction in Singularly Perturbed Continuous-Time Bayesian Networks


OP: kedemingshi (employment verified), posted 2022-4-13 17:15:00 from a mobile device

Translated abstract:
Continuous-time Bayesian networks are directed-graph representations of multi-component continuous-time Markov processes. The edges in the network represent direct influences among components. The joint rate matrix of the multi-component process is specified by means of a conditional rate matrix for each component separately. This paper addresses the situation where some components evolve on a much shorter time scale than the others. It proves that in the limit of infinite scale separation, the Markov process converges (in distribution, i.e. weakly) to a reduced, or effective, Markov process that involves only the slow components. It also demonstrates that for a reasonable separation of scales (one order of magnitude) the reduced process is a good approximation of the marginal process over the slow components. The paper gives a simple procedure for constructing the reduced CTBN, with conditional rate matrices that can be computed directly from the original CTBN, and discusses the implications for approximate inference in large systems.
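The amalgamation step mentioned above — assembling the joint rate matrix of the multi-component process from the per-component conditional rate matrices — can be illustrated concretely. Below is a minimal sketch (not the authors' code), assuming a hypothetical two-component CTBN in which a binary component X has a binary parent Y; all rate values are invented for illustration.

```python
import numpy as np

# Hypothetical conditional rate matrices for a binary component X given a binary parent Y.
Q_X_given_Y = [np.array([[-0.5, 0.5],
                         [ 0.3, -0.3]]),   # rates of X while Y = 0
               np.array([[-2.0, 2.0],
                         [ 1.0, -1.0]])]   # rates of X while Y = 1

# Unconditional rate matrix for Y (Y has no parents in this toy network).
Q_Y = np.array([[-1.0, 1.0],
                [ 2.0, -2.0]])

# Amalgamation: joint rate matrix over states (x, y), indexed i = 2*x + y.
# Only one component may change in an infinitesimal interval, so an entry is
# nonzero only when exactly one coordinate differs.
Q = np.zeros((4, 4))
for x in range(2):
    for y in range(2):
        i = 2 * x + y
        for x2 in range(2):
            if x2 != x:
                Q[i, 2 * x2 + y] = Q_X_given_Y[y][x, x2]   # X jumps, Y fixed
        for y2 in range(2):
            if y2 != y:
                Q[i, 2 * x + y2] = Q_Y[y, y2]              # Y jumps, X fixed
np.fill_diagonal(Q, -Q.sum(axis=1))                        # rows sum to zero
print(Q)
```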
---
English title:
Dimension Reduction in Singularly Perturbed Continuous-Time Bayesian Networks
---
Authors:
Nir Friedman, Raz Kupferman
---
Latest submission year:
2012
---
Classification:

Primary category: Computer Science
Secondary category: Artificial Intelligence
Category description: Covers all areas of AI except Vision, Robotics, Machine Learning, Multiagent Systems, and Computation and Language (Natural Language Processing), which have separate subject areas. In particular, includes Expert Systems, Theorem Proving (although this may overlap with Logic in Computer Science), Knowledge Representation, Planning, and Uncertainty in AI. Roughly includes material in ACM Subject Classes I.2.0, I.2.1, I.2.3, I.2.4, I.2.8, and I.2.11.

---
English abstract:
  Continuous-time Bayesian networks (CTBNs) are graphical representations of multi-component continuous-time Markov processes as directed graphs. The edges in the network represent direct influences among components. The joint rate matrix of the multi-component process is specified by means of conditional rate matrices for each component separately. This paper addresses the situation where some of the components evolve on a time scale that is much shorter compared to the time scale of the other components. In this paper, we prove that in the limit where the separation of scales is infinite, the Markov process converges (in distribution, or weakly) to a reduced, or effective Markov process that only involves the slow components. We also demonstrate that for reasonable separation of scale (an order of magnitude) the reduced process is a good approximation of the marginal process over the slow components. We provide a simple procedure for building a reduced CTBN for this effective process, with conditional rate matrices that can be directly calculated from the original CTBN, and discuss the implications for approximate reasoning in large systems.
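To make the scale-separation claim concrete, the following sketch (again a hypothetical two-component example with invented rates, not the paper's construction verbatim) scales the fast component's rates by 1/eps, forms an effective rate matrix for the slow component by averaging its conditional rate matrices over the fast component's stationary distribution, and compares the reduced process against the exact marginal of the slow component using matrix exponentials.

```python
import numpy as np
from scipy.linalg import expm

eps = 0.1  # scale separation: Y evolves ~1/eps faster than X

# Fast component Y: rates scaled by 1/eps so that Y equilibrates quickly.
Q_Y = (1.0 / eps) * np.array([[-1.0, 1.0],
                              [ 2.0, -2.0]])
# Slow component X: one conditional rate matrix per state of Y (toy values).
Q_X = [np.array([[-0.5, 0.5], [0.3, -0.3]]),
       np.array([[-2.0, 2.0], [1.0, -1.0]])]

# Joint rate matrix over (x, y), indexed i = 2*x + y (amalgamation as above).
Q = np.zeros((4, 4))
for x in range(2):
    for y in range(2):
        i = 2 * x + y
        Q[i, 2 * (1 - x) + y] = Q_X[y][x, 1 - x]   # X jumps, Y fixed
        Q[i, 2 * x + (1 - y)] = Q_Y[y, 1 - y]      # Y jumps, X fixed
np.fill_diagonal(Q, -Q.sum(axis=1))

# Stationary distribution of the fast component (left null vector of Q_Y).
w, v = np.linalg.eig(Q_Y.T)
pi_Y = np.real(v[:, np.argmin(np.abs(w))])
pi_Y /= pi_Y.sum()

# Effective (reduced) rate matrix for the slow component: average over pi_Y.
Q_eff = pi_Y[0] * Q_X[0] + pi_Y[1] * Q_X[1]

# Compare the exact marginal of X with the reduced-process prediction at time t.
t, p0 = 5.0, np.array([1.0, 0.0, 0.0, 0.0])        # start in (x=0, y=0)
p_full = p0 @ expm(Q * t)
print("marginal of X:", [p_full[0] + p_full[1], p_full[2] + p_full[3]])
print("reduced CTBN :", np.array([1.0, 0.0]) @ expm(Q_eff * t))
```

With eps = 0.1 the two printed distributions should be close, illustrating the paper's point that one order of magnitude of scale separation already yields a good approximation of the slow components' marginal process.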
---
PDF link:
https://arxiv.org/pdf/1206.6835

Keywords: continuous time, Bayesian, presentation, distribution, intelligence, scale, continuous, conditional, reasoning, multi
