Nanotechnology

Self-controlled multilevel writing architecture for fast training in neuromorphic RRAM applications

Abstract

Memristor crossbar arrays naturally accelerate neural network applications by carrying out parallel multiply-add operations. Because of the abrupt SET operation that characterizes most RRAM devices, on-chip training usually requires iterative write/read stages, large and variation-sensitive circuitry, or both, to achieve multilevel capabilities. This paper presents a self-controlled architecture that programs multilevel devices with a short, fixed operation duration. We rely on an ad hoc scheme to self-control the abrupt SET, choking the writing stimulus as the cell reaches the desired level. To achieve this, we exploit the voltage divider concept by placing a variable resistive load in series with the target cell. We validated the proposal through thorough simulations using RRAM cells fitted to extremely fast physical devices and a commercial 40 nm CMOS technology, both exhibiting variability. In every case the proposed architecture produced progressive and almost-linear resistive levels in both 1T1R and 1R crossbar structures.
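The voltage-divider self-limiting mechanism can be illustrated with a minimal numerical sketch. All parameter values, the abrupt-SET update rule, and the stop threshold below are assumptions chosen for illustration, not the paper's circuit or device model:

```python
# Minimal sketch of the voltage-divider self-limiting SET idea (illustrative only).
# Every value and the abrupt-SET update rule below are assumptions, not the paper's model.
V_APP = 1.2        # applied write voltage (V), assumed
R_LOAD = 50e3      # series resistive load (ohm), assumed; choosing it picks the target level
R_CELL_HRS = 1e6   # initial high-resistance state of the cell (ohm), assumed
R_CELL_MIN = 5e3   # lowest reachable cell resistance (ohm), assumed
V_SET = 0.6        # minimum voltage across the cell for the SET to keep progressing (V), assumed

r_cell = R_CELL_HRS
steps = 0
while steps < 100:                               # one short, fixed-duration pulse split into steps
    v_cell = V_APP * r_cell / (r_cell + R_LOAD)  # voltage divider formed with the series load
    if v_cell < V_SET:                           # stimulus choked: the SET stops by itself
        break
    r_cell = max(R_CELL_MIN, 0.8 * r_cell)       # abrupt SET: resistance collapses while biased
    steps += 1

# The cell settles near R_LOAD * V_SET / (V_APP - V_SET), i.e. the level is set by the load.
print(f"settled at ~{r_cell / 1e3:.1f} kOhm after {steps} steps")
```

In this toy model, sweeping R_LOAD moves the settling resistance (roughly R_LOAD * V_SET / (V_APP - V_SET)), which is the kind of load-selected multilevel behaviour the series resistive load is meant to provide within a single fixed-duration pulse.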
