IEEE Transactions on Wireless Communications

Context-Aware Proactive Content Caching With Service Differentiation in Wireless Networks


Abstract

Content caching in small base stations or wireless infostations is considered to be a suitable approach to improve the efficiency of wireless content delivery. Placing the optimal content into local caches is crucial due to storage limitations, but it requires knowledge about the content popularity distribution, which is often not available in advance. Moreover, local content popularity is subject to fluctuations, since mobile users with different interests connect to the caching entity over time. Which content a user prefers may depend on the user's context. In this paper, we propose a novel algorithm for context-aware proactive caching. The algorithm learns context-specific content popularity online by regularly observing the context information of connected users, updating the cache content, and subsequently observing cache hits. We derive a sublinear regret bound, which characterizes the learning speed and proves that our algorithm converges to the optimal cache content placement strategy in terms of maximizing the number of cache hits. Furthermore, our algorithm supports service differentiation by allowing operators of caching entities to prioritize customer groups. Our numerical results confirm that our algorithm outperforms state-of-the-art algorithms on a real-world data set, increasing the number of cache hits by at least 14%.
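To make the prose above concrete, the following Python sketch illustrates the general idea described in the abstract: per-context content popularity is estimated online from observed cache hits, the cache is refreshed each period for the currently connected users, and a simple weighting stands in for service differentiation. It is not the paper's actual algorithm or regret-optimal learning rule; the class name `ContextAwareCache`, the epsilon-greedy exploration, the smoothed estimator, and the `weights` argument are illustrative assumptions.

```python
import random
from collections import defaultdict


class ContextAwareCache:
    """Sketch: learn per-context popularity online and refresh the cache each period."""

    def __init__(self, num_files, cache_size, epsilon=0.1):
        self.num_files = num_files
        self.cache_size = cache_size
        self.epsilon = epsilon                  # exploration rate (illustrative choice)
        self.hits = defaultdict(int)            # (context, file) -> observed cache hits
        self.placements = defaultdict(int)      # (context, file) -> times the file was cached

    def _estimate(self, context, f):
        # Smoothed estimate of the hit probability of file f for users with this context.
        return (self.hits[(context, f)] + 1) / (self.placements[(context, f)] + 2)

    def place(self, contexts, weights=None):
        """Select cache content for the contexts of the currently connected users.

        `weights` is a crude stand-in for service differentiation: prioritized
        customer groups contribute more to a file's aggregated score.
        """
        weights = weights or [1.0] * len(contexts)
        ranked = sorted(
            range(self.num_files),
            key=lambda f: sum(w * self._estimate(c, f) for c, w in zip(contexts, weights)),
            reverse=True,
        )
        cache = ranked[: self.cache_size]
        # Occasionally swap in a random uncached file so all popularities keep being learned.
        if random.random() < self.epsilon:
            uncached = [f for f in range(self.num_files) if f not in cache]
            if uncached:
                cache[-1] = random.choice(uncached)
        return cache

    def observe(self, contexts, cache, requests):
        """Update estimates after one period; `requests` is a list of (context, file) pairs."""
        for ctx in contexts:
            for f in cache:
                self.placements[(ctx, f)] += 1
        for ctx, f in requests:
            if f in cache:
                self.hits[(ctx, f)] += 1


# Example usage with two discretized user contexts; the first group is prioritized.
cache = ContextAwareCache(num_files=500, cache_size=20)
contexts = ["young_commuter", "senior_resident"]
placed = cache.place(contexts, weights=[2.0, 1.0])
cache.observe(contexts, placed, requests=[("young_commuter", placed[0])])
```

On the convergence claim: a regret bound that is sublinear in the number of periods means the average per-period loss relative to the optimal, popularity-aware placement vanishes over time, which is the sense in which the abstract states that the algorithm converges to the optimal cache content placement strategy.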