
Activation Functions of Deep Neural Networks for Polar Decoding Applications


Abstract

Among the various components of deep neural networks (DNNs), this paper studies activation functions, focusing on deep feed-forward networks applied to the channel decoding problem for polar codes. In line with our previous study, this paper considers the ReLU (Rectified Linear Unit) and its variants as activation functions of the DNN. We devise a new ReLU variant, called Sloped ReLU, by varying the slope of the ReLU over the positive domain. This is motivated by the analogy, in terms of tree architectures, between the likelihood function in successive decoding of channel codes and the activation function in the DNN. Our numerical results show that polar decoding performance with the Sloped ReLU improves as the slope increases, up to a certain level. We believe that the idea of exploiting this analogy to determine DNN activation functions can be applied to other decoding problems as well, which remains as future work.
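The sketch below illustrates one plausible reading of the Sloped ReLU described in the abstract: a ReLU whose positive-domain slope is a tunable hyperparameter, with slope 1 recovering the ordinary ReLU. The function name `sloped_relu` and the slope values shown are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def sloped_relu(x: np.ndarray, slope: float = 1.0) -> np.ndarray:
    """ReLU variant whose slope over the positive domain is adjustable.

    slope = 1.0 corresponds to the standard ReLU; larger (or smaller)
    values scale the positive outputs accordingly, while negative
    inputs are still mapped to zero.
    """
    return np.where(x > 0.0, slope * x, 0.0)

if __name__ == "__main__":
    x = np.linspace(-2.0, 2.0, 9)
    for a in (0.5, 1.0, 2.0):  # hypothetical slope settings for illustration
        print(f"slope={a}: {sloped_relu(x, a)}")
```

In a DNN decoder, the slope would be fixed per experiment (or per layer) and swept over a range of values to observe the effect on decoding performance, in the spirit of the study reported above.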
