
Energy-Efficiency Oriented Traffic Offloading in Wireless Networks: A Brief Survey and a Learning Approach for Heterogeneous Cellular Networks



Abstract

This paper first provides a brief survey on existing traffic offloading techniques in wireless networks. Particularly, as a case study, we put forward an online reinforcement learning framework for the problem of traffic offloading in a stochastic heterogeneous cellular network (HCN), where the time-varying traffic in the network can be offloaded to nearby small cells. Our aim is to minimize the total discounted energy consumption of the HCN while maintaining the quality-of-service (QoS) experienced by mobile users. For each cell (i.e., a macro cell or a small cell), the energy consumption is determined by its system load, which is coupled with the system loads in other cells due to the sharing of a common frequency band. We model the energy-aware traffic offloading problem in such HCNs as a discrete-time Markov decision process (DTMDP). Based on the traffic observations and the traffic offloading operations, the network controller gradually optimizes the traffic offloading strategy with no prior knowledge of the DTMDP statistics. Such a model-free learning framework is important, particularly when the state space is huge. In order to solve the curse of dimensionality, we design a centralized Q-learning with compact state representation algorithm, which is named QC-learning. Moreover, a decentralized version of the QC-learning is developed based on the fact that the macro base stations (BSs) can independently manage the operations of local small-cell BSs by making use of the global network state information obtained from the network controller. Simulations are conducted to show the effectiveness of the derived centralized and decentralized QC-learning algorithms in balancing the tradeoff between energy saving and QoS satisfaction.
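
To make the model-free learning idea concrete, the following is a minimal, illustrative sketch of tabular Q-learning applied to an offloading decision. The state space (a discretized macro-cell load), the two actions, the cost signal, and the parameters ALPHA, GAMMA, and EPSILON are hypothetical and are not taken from the paper; the paper's QC-learning additionally uses a compact state representation, which is omitted here.

    # Illustrative tabular Q-learning for a toy offloading decision (assumed setup).
    import random
    from collections import defaultdict

    ACTIONS = ["keep_at_macro", "offload_to_small_cell"]
    ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1   # learning rate, discount factor, exploration rate

    Q = defaultdict(float)                  # Q[(state, action)] -> estimated discounted cost

    def choose_action(state):
        # epsilon-greedy: usually pick the action with the lowest estimated cost
        if random.random() < EPSILON:
            return random.choice(ACTIONS)
        return min(ACTIONS, key=lambda a: Q[(state, a)])

    def update(state, action, cost, next_state):
        # standard Q-learning update toward the one-step energy cost plus the
        # discounted minimum-cost estimate of the next state; no knowledge of
        # the DTMDP transition statistics is required
        best_next = min(Q[(next_state, a)] for a in ACTIONS)
        Q[(state, action)] += ALPHA * (cost + GAMMA * best_next - Q[(state, action)])

In this sketch, the per-step cost would be derived from the load-dependent energy consumption of the serving cell, with a penalty whenever the user's QoS requirement is violated; the estimates in Q converge toward the discounted cost of each action as observations accumulate.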
