
Exploring the Impact of Variability in Resistance Distributions of RRAM on the Prediction Accuracy of Deep Learning Neural Networks


Abstract

In this work, we explore the use of the resistive random access memory (RRAM) device as a synapse for mimicking the trained weights linking neurons in a deep learning neural network (DNN), namely AlexNet. The RRAM devices were fabricated in-house and subjected to 1000 bipolar read-write cycles to measure the resistances recorded for Logic-0 and Logic-1 (we demonstrate the feasibility of achieving eight discrete resistance states in the same device depending on the RESET stop voltage). DNN simulations were performed to compare the relative error between the output of AlexNet Layer 1 (convolution) implemented with weights trained by the standard backpropagation (BP) algorithm and the same weights encoded using the measured resistance distributions from RRAM. The IMAGENET dataset is used for classification. We focus only on the Layer 1 weights in the AlexNet framework, with the 11 × 11 × 96 filter values coded into binary floating point and substituted with the RRAM resistance values corresponding to Logic-0 and Logic-1. The impact of variability in the low and high resistance states of the RRAM on image classification accuracy is studied by formulating a look-up table (LUT) for the RRAM (from measured I-V data) and comparing the convolution output of AlexNet Layer 1 against the standard outputs from the BP-trained weights. This is one of the first studies dedicated to exploring the impact of RRAM device resistance variability on the prediction accuracy of a convolutional neural network (CNN) on an AlexNet platform through a framework that requires only limited device switching test data.
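The framework sketched in the abstract can be illustrated with a minimal NumPy example: each bit of a binary floating-point weight is stored in one RRAM cell whose resistance is drawn from the distribution of its logic state, read back by thresholding, and the relative error of a convolution output is compared against the BP-trained baseline. The distribution parameters, the read threshold, and the tiny filter size below are all illustrative assumptions standing in for the paper's measured LUT and AlexNet's 11 × 11 × 96 filters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical log-normal resistance distributions (ohms) standing in for
# the measured Logic-1 (low-resistance) and Logic-0 (high-resistance)
# states; in the paper the LUT comes from measured I-V data.
LRS_MEAN, LRS_SIGMA = 1e4, 0.05   # Logic-1 (low-resistance state)
HRS_MEAN, HRS_SIGMA = 1e6, 0.25   # Logic-0 (high-resistance state)
READ_THRESHOLD = 1e5              # resistances above this read back as Logic-0

def weights_to_bits(w):
    """View float32 weights as a flat stream of 32 bits per weight."""
    return np.unpackbits(np.asarray(w, dtype=np.float32).reshape(-1).view(np.uint8))

def bits_to_weights(bits, shape):
    """Inverse of weights_to_bits."""
    return np.packbits(bits).view(np.float32).reshape(shape)

def rram_roundtrip(w):
    """Store each weight bit in one RRAM cell whose resistance is sampled
    from its state's distribution, then read it back by thresholding."""
    bits = weights_to_bits(w)
    r = np.where(bits == 1,
                 rng.lognormal(np.log(LRS_MEAN), LRS_SIGMA, bits.size),
                 rng.lognormal(np.log(HRS_MEAN), HRS_SIGMA, bits.size))
    return bits_to_weights((r < READ_THRESHOLD).astype(np.uint8), w.shape)

# Toy stand-in for one Layer-1 convolution window: compare the BP-trained
# output with the output from RRAM-encoded weights.
w_bp = rng.standard_normal((4, 4)).astype(np.float32)   # BP-trained weights
x = rng.standard_normal((4, 4)).astype(np.float32)      # input patch
w_rram = rram_roundtrip(w_bp)

out_bp = float(np.tensordot(x, w_bp))
out_rram = float(np.tensordot(x, w_rram))
rel_err = abs(out_rram - out_bp) / abs(out_bp)
print(f"relative error: {rel_err:.3e}")  # 0 while the two states stay well separated

# Widening the Logic-0 spread lets some cells read back as the wrong bit,
# which is the variability effect the study quantifies.
HRS_SIGMA = 1.5
w_noisy = rram_roundtrip(w_bp)
```

With tightly separated distributions the bit-level round trip is lossless and the convolution output matches exactly; widening a state's spread flips occasional bits (exponent-bit flips in particular cause large weight errors), which is how resistance variability propagates into classification accuracy.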
