Computer Design, 2009 (ICCD 2009)

Extending data prefetching to cope with context switch misses



Abstract

Among the various costs of a context switch, its impact on the performance of L2 caches is the most significant because of the resulting high miss penalty. To reduce the impact of frequent context switches, we propose restoring a program's locality by prefetching into the L2 cache the data a program was using before it was swapped out. A Global History List is used to record a process' L2 read accesses in LRU order. These accesses are saved along with the process' context when the process is swapped out, and are loaded to guide prefetching when it is swapped in. We also propose a feedback mechanism that greatly reduces memory traffic incurred by our prefetching scheme. Experiments show significant speedup over baseline architectures with and without traditional prefetching in the presence of frequent context switches.
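
The abstract only outlines the mechanism, so the following is a minimal sketch, assuming the Global History List is a fixed-capacity, LRU-ordered list of L2 block addresses that is snapshotted with the context at swap-out and replayed as prefetches at swap-in. All identifiers (GlobalHistoryList, recordRead, issueL2Prefetch) and the capacity are illustrative, not taken from the paper, and the feedback mechanism that throttles prefetch traffic is not modeled.

#include <cstdint>
#include <cstdio>
#include <list>
#include <unordered_map>
#include <vector>

// Hypothetical prefetch hook; real hardware would issue an L2 fill request here.
static void issueL2Prefetch(std::uint64_t blockAddr) {
    std::printf("prefetch block 0x%llx into L2\n",
                static_cast<unsigned long long>(blockAddr));
}

class GlobalHistoryList {
public:
    explicit GlobalHistoryList(std::size_t capacity) : capacity_(capacity) {}

    // Record an L2 read access: the touched block becomes most recently used.
    void recordRead(std::uint64_t blockAddr) {
        auto it = pos_.find(blockAddr);
        if (it != pos_.end()) {
            lru_.erase(it->second);      // drop the stale entry for this block
        } else if (lru_.size() == capacity_) {
            pos_.erase(lru_.back());     // evict the least recently used block
            lru_.pop_back();
        }
        lru_.push_front(blockAddr);
        pos_[blockAddr] = lru_.begin();
    }

    // Snapshot taken when the process is swapped out and saved with its context.
    std::vector<std::uint64_t> snapshot() const {
        return std::vector<std::uint64_t>(lru_.begin(), lru_.end());
    }

private:
    std::size_t capacity_;
    std::list<std::uint64_t> lru_;       // MRU at the front, LRU at the back
    std::unordered_map<std::uint64_t, std::list<std::uint64_t>::iterator> pos_;
};

// When the process is swapped back in, replay the saved list (MRU first)
// to warm the L2 cache before the program resumes.
static void prefetchOnSwapIn(const std::vector<std::uint64_t>& saved) {
    for (std::uint64_t addr : saved) issueL2Prefetch(addr);
}

int main() {
    GlobalHistoryList ghl(4);            // illustrative capacity, not from the paper
    const std::uint64_t trace[] = {0x1000, 0x2040, 0x1000, 0x80C0};
    for (std::uint64_t a : trace) ghl.recordRead(a);
    prefetchOnSwapIn(ghl.snapshot());    // issues prefetches, most recent block first
    return 0;
}

Keeping the list MRU-first bounds both the state saved with the context and the burst of prefetches issued at swap-in; that burst is the memory traffic the paper's feedback mechanism is designed to cut.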
