International Symposium on Embedded Computing and System Design

A Memristive Activation Circuit for Deep Learning Neural Networks



Abstract

A highly efficient activation circuit based on a memristor MIN function is presented for memristive neuromorphic systems, using only two memristors and a comparator. For the first time, the ReLU activation function is approximated with such a circuit. Because of its simplicity and effectiveness in deep neural networks, the ReLU activation function significantly reduces the time and computational cost of training in neuromorphic systems. A multilayer neural network is simulated using this activation circuit together with conventional memristor crossbar arrays. The results show that the proposed circuit trains effectively while yielding significant savings in time and area in memristor-crossbar-based neural networks.
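The abstract's idea of realizing ReLU through a MIN operation can be illustrated numerically. The identity ReLU(x) = x − min(x, 0) is one way a MIN function can express ReLU; this is an illustrative sketch of that identity, not the authors' two-memristor circuit.

```python
import numpy as np

def relu_via_min(x):
    # ReLU(x) = max(x, 0) rewritten using only a MIN operation:
    # x - min(x, 0). Subtracting the negative part leaves the
    # positive part unchanged and clamps negatives to zero.
    return x - np.minimum(x, 0.0)

# Example: negative inputs map to 0, positive inputs pass through.
print(relu_via_min(np.array([-1.0, 0.0, 2.0])))  # [0. 0. 2.]
```

In hardware terms, this suggests why a comparator plus a MIN-style element suffices: the circuit only needs to detect and suppress the negative part of the input.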

