...

Machine-Learning-Aided Optical Fiber Communication System

Abstract

Fiber-optic networks offer high speed, large bandwidth, and a high degree of reliability. However, the development of optical communication technology has hit a bottleneck due to challenges such as energy loss, cost, and system capacity approaching the Shannon limit. As a powerful tool, machine learning provides a strong driving force for many industries and greatly promotes the development of society. It also offers a possible path toward greater transmission capacities and longer transmission distances in optical communications. In this article, we introduce the application of machine learning to optical communication network systems. Three use cases are presented to evaluate the feasibility of the proposed architecture. In the transmission layer, a principal-component-based phase estimation algorithm is used for phase noise recovery in coherent optical systems, and the K-means algorithm is adopted to mitigate nonlinear noise in probabilistically shaped systems. In the network layer, the long short-term memory (LSTM) algorithm and the genetic algorithm are used for traffic prediction and for determining reasonable placement of remote radio heads in centralized radio access networks. Extensive simulations and experiments are conducted to compare the proposed algorithms with state-of-the-art schemes, and the results demonstrate the performance of the three use cases. Machine learning applied to the transmission layer can greatly improve digital signal processing performance without increasing complexity, while machine learning applied to the network layer can provide a more appropriate channel allocation plan in the era of high-speed communication. Ultimately, this article is intended to serve as a basis for stimulating further research on machine learning in optical communications.
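
To make the K-means transmission-layer use case concrete, the sketch below illustrates the general idea of K-means-aided symbol decision on a distorted constellation: cluster the received points, let the centroids track the distortion, and decide symbols against the centroids instead of the ideal grid. This is a toy example, not the authors' implementation; the 16-QAM format, the power-dependent phase-rotation impairment, the noise level, and the use of scikit-learn's KMeans are all assumptions made for illustration.

```python
# Minimal illustrative sketch (not the paper's implementation): K-means cluster
# centroids used as decision references for a received QAM constellation, in the
# spirit of the K-means-based nonlinear-noise mitigation mentioned in the abstract.
# The modulation order, impairment model, and all parameter values are assumptions.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Ideal 16-QAM constellation points.
levels = np.array([-3.0, -1.0, 1.0, 3.0])
ideal = np.array([complex(i, q) for i in levels for q in levels])

# Transmit random symbols; add a toy power-dependent phase rotation (a stand-in
# for fiber nonlinearity) plus additive Gaussian noise.
n = 5000
tx_idx = rng.integers(0, 16, size=n)
tx = ideal[tx_idx]
rx = tx * np.exp(1j * 0.012 * np.abs(tx) ** 2)
rx += 0.2 * (rng.standard_normal(n) + 1j * rng.standard_normal(n))

# Baseline: hard decision against the undistorted ideal grid.
grid_idx = np.argmin(np.abs(rx[:, None] - ideal[None, :]), axis=1)

# K-means: initialize centroids at the ideal points so cluster labels stay tied
# to constellation symbols, then let the centroids track the distorted clusters.
init = np.column_stack([ideal.real, ideal.imag])
points = np.column_stack([rx.real, rx.imag])
km = KMeans(n_clusters=16, init=init, n_init=1).fit(points)
km_idx = km.labels_

print(f"Symbol error rate, ideal-grid decision: {np.mean(grid_idx != tx_idx):.4f}")
print(f"Symbol error rate, K-means centroids:   {np.mean(km_idx != tx_idx):.4f}")
```

Initializing the centroids at the ideal constellation points keeps each cluster label tied to a transmitted symbol, so the learned centroids simply replace the ideal grid as decision references once the fit converges.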

Bibliographic information

  • Source
    IEEE Network | 2021, Issue 4 | pp. 136-142 | 7 pages
  • Author affiliations

    Beijing University of Posts and Telecommunications, Beijing, People's Republic of China;
    Beijing University of Posts and Telecommunications, Beijing, People's Republic of China;
    Beijing University of Posts and Telecommunications, Beijing, People's Republic of China;
    Beijing University of Posts and Telecommunications, Beijing, People's Republic of China;
    Beijing University of Posts and Telecommunications, Beijing, People's Republic of China;
    Qatar University, Computer Science & Engineering, Doha, Qatar

  • Indexing information
  • Original format: PDF
  • Language: eng
  • Chinese Library Classification
  • Keywords
