IEEE International Conference on Computer Design

Extending Data Prefetching to Cope with Context Switch Misses


Abstract

Among the various costs of a context switch, its impact on the performance of L2 caches is the most significant because of the resulting high miss penalty. To reduce the impact of frequent context switches, we propose restoring a program's locality by prefetching into the L2 cache the data the program was using before it was swapped out. A Global History List is used to record a process's L2 read accesses in LRU order. These accesses are saved along with the process's context when the process is swapped out, and are loaded to guide prefetching when it is swapped in. We also propose a feedback mechanism that greatly reduces the memory traffic incurred by our prefetching scheme. Experiments show significant speedup over baseline architectures, with and without traditional prefetching, in the presence of frequent context switches.
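The abstract outlines a data structure (the Global History List) plus a save/restore protocol around context switches. The following Python sketch only illustrates that idea in software; the class and function names, the list capacity, and the `useful_ratio`/`threshold` parameters of the feedback gate are assumptions for illustration, not the paper's actual hardware mechanism, which is implemented in the memory hierarchy.

```python
from collections import OrderedDict


class GlobalHistoryList:
    """Illustrative sketch: records a process's L2 read accesses in LRU order
    (most recently used block last)."""

    def __init__(self, capacity=512):           # capacity is an assumed parameter
        self.capacity = capacity
        self.blocks = OrderedDict()              # block address -> None, ordered by recency

    def record_read(self, block_addr):
        # Move (or insert) the block to the most-recently-used position.
        self.blocks.pop(block_addr, None)
        self.blocks[block_addr] = None
        if len(self.blocks) > self.capacity:
            self.blocks.popitem(last=False)      # drop the least recently used entry

    def snapshot(self):
        # Saved along with the process context when the process is swapped out.
        return list(self.blocks.keys())


def prefetch_on_swap_in(saved_history, l2_cache, useful_ratio, threshold=0.25):
    """Re-warm the L2 with blocks the process was using before it was swapped out.

    `useful_ratio` stands in for the feedback mechanism: the fraction of blocks
    prefetched at the previous swap-in that were actually re-referenced. If it
    falls below the (assumed) threshold, prefetching is suppressed to limit
    memory traffic.
    """
    if useful_ratio < threshold:
        return 0                                 # feedback says prefetching is not paying off
    issued = 0
    for block_addr in reversed(saved_history):   # most recently used blocks first
        if block_addr not in l2_cache:
            l2_cache.add(block_addr)             # stands in for issuing a prefetch request
            issued += 1
    return issued
```

In a simple simulation built on this sketch, one would call record_read() on every L2 read by the running process, store snapshot() with its context at swap-out, and pass the saved list to prefetch_on_swap_in() at the next swap-in.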
