Design, Automation & Test in Europe Conference & Exhibition (DATE)

Context-sensitive timing automata for fast source level simulation

Abstract

We present a novel technique for efficient source-level timing simulation of embedded software executing on a target platform. In contrast to existing approaches, the proposed technique can accurately approximate execution time without requiring a dynamic cache model, thereby avoiding the dramatic reduction in simulation performance that is inherent to dynamic cache modeling. Consequently, our approach makes the performance potential of source-level simulation available for complex microarchitectures that include caches. Our approach builds on recent advances in context-sensitive binary-level timing simulation. However, a direct application of the binary-level approach to source-level simulation degrades simulation performance in much the same way as dynamic cache modeling. To overcome this performance limitation, we contribute a novel pushdown-automaton-based simulation technique. The proposed context-sensitive timing automata enable an efficient evaluation of complex simulation logic with little overhead. Experimental results show that the proposed technique provides a speedup of an order of magnitude over existing context selection techniques and simple source-level cache models. Simulation performance is comparable to a state-of-the-art accelerated cache simulation; however, that accelerated simulation is only applicable in specific circumstances, whereas the proposed approach does not suffer from this limitation.
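
The abstract names the core mechanism (a pushdown-automaton-based evaluation of context-sensitive timing annotations) but gives no implementation detail. The following C++ fragment is a minimal, hypothetical sketch of that general idea: instrumented source code charges pre-characterized, context-dependent cycle costs, and the execution context is tracked with an explicit stack instead of a dynamic cache model. All names (TimingAutomaton, visit, f_body) and all cycle numbers are illustrative assumptions, not the authors' implementation.

// Minimal sketch (illustration only, not the authors' implementation):
// a pushdown-automaton-style timing model for source-level simulation.
// Basic-block cycle costs depend on the current execution context, which
// is tracked by pushing/popping call sites instead of simulating a cache.
#include <cstdint>
#include <iostream>
#include <map>
#include <stack>
#include <string>
#include <utility>

using BlockId = std::string;   // hypothetical basic-block identifier
using Context = std::string;   // e.g. the call site that reached the block

class TimingAutomaton {
public:
    // Register the cycle cost of a block in a given context (e.g. cold vs.
    // warm cache), assumed to come from an offline context-sensitive analysis.
    void addCost(const BlockId& block, const Context& ctx, uint64_t cycles) {
        cost_[{block, ctx}] = cycles;
    }

    // Called from instrumented source code at call/return points:
    // the stack is the "pushdown" part of the automaton.
    void push(const Context& ctx) { ctxStack_.push(ctx); }
    void pop()                    { if (!ctxStack_.empty()) ctxStack_.pop(); }

    // Called at the start of every basic block: charge the
    // context-sensitive cycle count without any cache lookup.
    void visit(const BlockId& block) {
        const Context ctx = ctxStack_.empty() ? "<top>" : ctxStack_.top();
        auto it = cost_.find({block, ctx});
        // Fall back to a default cost if this context was not characterized.
        cycles_ += (it != cost_.end()) ? it->second : defaultCost_;
    }

    uint64_t cycles() const { return cycles_; }

private:
    std::map<std::pair<BlockId, Context>, uint64_t> cost_;
    std::stack<Context> ctxStack_;
    uint64_t cycles_ = 0;
    uint64_t defaultCost_ = 10;   // purely illustrative fallback
};

int main() {
    TimingAutomaton ta;
    // Assume analysis produced these context-dependent costs: the first
    // call of f() sees a cold cache, the second call a warm one.
    ta.addCost("f_body", "call_from_main_1", 120);  // cold context
    ta.addCost("f_body", "call_from_main_2", 35);   // warm context

    // Instrumented source-level execution of: main() { f(); f(); }
    ta.push("call_from_main_1"); ta.visit("f_body"); ta.pop();
    ta.push("call_from_main_2"); ta.visit("f_body"); ta.pop();

    std::cout << "simulated cycles: " << ta.cycles() << "\n";  // prints 155
    return 0;
}

In this toy run the same basic block costs 120 cycles in the cold context and 35 cycles in the warm one, so the simulation totals 155 cycles without a single per-access cache lookup, which is the kind of saving the abstract attributes to avoiding a dynamic cache model.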
