IEEE Transactions on Very Large Scale Integration (VLSI) Systems

Efficient VLSI Implementation of Neural Networks With Hyperbolic Tangent Activation Function



Abstract

The nonlinear activation function is one of the main building blocks of artificial neural networks, with the hyperbolic tangent and the sigmoid being the most widely used. Accurate implementation of these transfer functions in digital hardware faces certain challenges. In this paper, an efficient approximation scheme for the hyperbolic tangent function is proposed. The approximation is based on a mathematical analysis that treats the maximum allowable error as a design parameter. A hardware implementation of the proposed approximation scheme is presented and shown to compare favorably with previous architectures in terms of area and delay. For the same maximum allowable error, the proposed structure requires fewer output bits than the state of the art. Since the number of output bits of the activation function determines the bit width of the multipliers and adders in the network, the proposed activation function reduces area, delay, and power in VLSI implementations of artificial neural networks with hyperbolic tangent activation.
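The paper's exact approximation scheme is not reproduced in this abstract; as a software illustration of the general idea (approximating tanh subject to a maximum allowable error, and exploiting the odd symmetry tanh(−x) = −tanh(x)), the sketch below builds a piecewise-linear table whose segments are grown greedily until a chord-error check exceeds the error budget. All function names and the greedy midpoint-error criterion are this sketch's own assumptions, not the paper's method:

```python
import math

def build_pwl_tanh(max_err, x_max=4.0):
    """Build piecewise-linear segments approximating tanh on [0, x_max].

    Each segment is extended greedily until the chord's midpoint error
    exceeds max_err. For a concave function the true maximum chord error
    is at most about twice the midpoint error, so this is an illustrative
    error control, not the paper's analysis.
    """
    segments = []  # list of (x0, x1, slope, intercept)
    x0 = 0.0
    step = 1e-3
    while x0 < x_max:
        x1 = x0 + step
        # Extend the segment while the chord stays within the budget.
        while x1 < x_max:
            cand = x1 + step
            slope = (math.tanh(cand) - math.tanh(x0)) / (cand - x0)
            xm = 0.5 * (x0 + cand)
            if abs(math.tanh(x0) + slope * (xm - x0) - math.tanh(xm)) > max_err:
                break
            x1 = cand
        slope = (math.tanh(x1) - math.tanh(x0)) / (x1 - x0)
        segments.append((x0, x1, slope, math.tanh(x0) - slope * x0))
        x0 = x1
    return segments

def pwl_tanh(x, segments):
    """Evaluate the approximation, using tanh(-x) = -tanh(x) for x < 0."""
    s = -1.0 if x < 0 else 1.0
    x = abs(x)
    for x0, x1, m, b in segments:
        if x <= x1:
            return s * (m * x + b)
    return s * 1.0  # saturate beyond the last segment
```

In a hardware realization the segment slopes and intercepts would be quantized to fixed point, and the number of output bits needed to respect the error budget is exactly the quantity the paper's scheme minimizes.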
