Journal of Applied Probability

AUTOMATED STATE-DEPENDENT IMPORTANCE SAMPLING FOR MARKOV JUMP PROCESSES VIA SAMPLING FROM THE ZERO-VARIANCE DISTRIBUTION



Abstract

Many complex systems can be modeled via Markov jump processes. Applications include chemical reactions, population dynamics, and telecommunication networks. Rare-event estimation for such models can be difficult and is often computationally expensive, because typically many (or very long) paths of the Markov jump process need to be simulated in order to observe the rare event. We present a state-dependent importance sampling approach to this problem that is adaptive and uses Markov chain Monte Carlo to sample from the zero-variance importance sampling distribution. The method is applicable to a wide range of Markov jump processes and achieves high accuracy, while requiring only a small sample to obtain the importance parameters. We demonstrate its efficiency through benchmark examples in queueing theory and stochastic chemical kinetics.
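To make the setting concrete, the sketch below estimates a classic rare event for a Markov jump process: the probability that an M/M/1 queue, started at level 1, reaches a high level N before emptying. It uses the standard state-independent change of measure (swapping arrival and service rates on the embedded jump chain) rather than the paper's adaptive, MCMC-driven approximation of the zero-variance distribution; the function name and parameters are illustrative, not from the paper.

```python
import random

def is_estimate(lam=1.0, mu=2.0, N=15, n_paths=5000, seed=42):
    """Importance-sampling estimate of P(queue hits N before 0 | start at 1)
    for an M/M/1 queue with arrival rate lam < service rate mu, using the
    classical rate-swap (lam <-> mu) change of measure on the embedded chain.
    Illustrative baseline only; not the paper's state-dependent scheme."""
    rng = random.Random(seed)
    p_tilt = mu / (lam + mu)              # tilted up-step probability (rates swapped)
    lr_up, lr_down = lam / mu, mu / lam   # per-step likelihood ratios
    total = 0.0
    for _ in range(n_paths):
        x, lr = 1, 1.0
        while 0 < x < N:
            if rng.random() < p_tilt:
                x += 1
                lr *= lr_up
            else:
                x -= 1
                lr *= lr_down
        if x == N:                        # rare event observed under the tilt
            total += lr                   # unbiased: weight by likelihood ratio
    return total / n_paths

# Exact gambler's-ruin answer for comparison: 1 / ((mu/lam)^N - 1) with lam=1, mu=2.
print(is_estimate())
```

On the success event the likelihood ratio collapses to (lam/mu)^(N-1), so nearly every simulated path contributes and the estimator's relative error stays small even though the true probability is about 3e-5; crude Monte Carlo would need millions of paths to observe the event at all. The paper's contribution is to automate and sharpen such tilts in a state-dependent way for general Markov jump processes.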
