IEEE Transactions on Mobile Computing

Deep Reinforcement Learning for Online Computation Offloading in Wireless Powered Mobile-Edge Computing Networks



Abstract

Wireless powered mobile-edge computing (MEC) has recently emerged as a promising paradigm to enhance the data processing capability of low-power networks, such as wireless sensor networks and the internet of things (IoT). In this paper, we consider a wireless powered MEC network that adopts a binary offloading policy, so that each computation task of the wireless devices (WDs) is either executed locally or fully offloaded to an MEC server. Our goal is to acquire an online algorithm that optimally adapts task offloading decisions and wireless resource allocations to the time-varying wireless channel conditions. This requires quickly solving hard combinatorial optimization problems within the channel coherence time, which is hardly achievable with conventional numerical optimization methods. To tackle this problem, we propose a Deep Reinforcement learning-based Online Offloading (DROO) framework that implements a deep neural network as a scalable solution that learns binary offloading decisions from experience. It eliminates the need for solving combinatorial optimization problems, and thus greatly reduces the computational complexity, especially in large-scale networks. To further reduce the complexity, we propose an adaptive procedure that automatically adjusts the parameters of the DROO algorithm on the fly. Numerical results show that the proposed algorithm can achieve near-optimal performance while decreasing the computation time by more than an order of magnitude compared with existing optimization methods. For example, the CPU execution latency of DROO is less than 0.1 second in a 30-user network, making real-time optimal offloading truly viable even in a fast-fading environment.
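The core idea of the framework described in the abstract is to replace a combinatorial search with a neural network that maps the current channel gains to relaxed offloading probabilities, quantize those probabilities into a handful of candidate binary decisions, and keep the candidate with the highest computation-rate reward. The following is a minimal, self-contained sketch of that decision pipeline; the network weights, the simplified flip-based quantizer, and the toy reward function are illustrative placeholders, not the trained model or the objective from the paper.

```python
import numpy as np

def dnn_forward(h, W1, b1, W2, b2):
    """Toy fully connected network mapping channel gains h to relaxed
    offloading probabilities in (0, 1). Weights here are random
    placeholders, not trained parameters."""
    z = np.maximum(0.0, h @ W1 + b1)              # ReLU hidden layer
    return 1.0 / (1.0 + np.exp(-(z @ W2 + b2)))   # sigmoid outputs

def quantize(probs, k):
    """Simplified order-preserving quantization: derive up to k candidate
    binary offloading decisions from the relaxed probabilities by
    thresholding, then flipping the entries closest to 0.5."""
    base = (probs > 0.5).astype(int)
    candidates = [base]
    order = np.argsort(np.abs(probs - 0.5))       # most ambiguous first
    for i in order[: k - 1]:
        cand = base.copy()
        cand[i] = 1 - cand[i]
        candidates.append(cand)
    return candidates

def toy_reward(h, x):
    """Stand-in for the weighted computation-rate objective: favor
    offloading (x=1) on strong channels, local computing on weak ones."""
    return float(np.sum(np.where(x == 1, h, 0.2)))

rng = np.random.default_rng(0)
n = 5                                  # number of wireless devices
h = rng.uniform(0.0, 1.0, n)           # channel gains for one time frame
W1 = rng.normal(size=(n, 8)); b1 = np.zeros(8)
W2 = rng.normal(size=(8, n)); b2 = np.zeros(n)

probs = dnn_forward(h, W1, b1, W2, b2)
best = max(quantize(probs, k=3), key=lambda x: toy_reward(h, x))
print("offloading decision:", best)
```

In the full algorithm, the chosen decision and its reward would also be stored in a replay memory and used to retrain the network, closing the reinforcement-learning loop; evaluating only k candidates per time frame is what keeps the per-decision cost far below an exhaustive search over all 2^n binary decisions.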


