New Networks Study Results from Sungkyunkwan University Described (Adaptive Weight-bit Inversion for State Error Reduction for Robust and Efficient Deep Neural Network Inference Using Mlc Nand Flash)



Abstract

By a News Reporter-Staff News Editor at Network Daily News - A new study on Networks is now available. According to news reporting from Suwon, South Korea, by NewsRx journalists, research stated, “When Flash memory is used to store the weights of a deep neural network (DNN), the inference accuracy can degrade owing to the state errors of the Flash memory. To protect the weights from state errors, the existing methods rely on an error correction code (ECC) or parity, which can incur power/storage overhead.”
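The abstract names the study's technique, adaptive weight-bit inversion, without describing its mechanics. The sketch below illustrates the general idea under stated assumptions: it is not the paper's actual scheme. It assumes 8-bit weights stored four 2-bit states per MLC cell group, assumes the higher-voltage states (`0b10`, `0b11`) are the error-prone ones, and stores each weight either as-is or bit-inverted, whichever form uses fewer error-prone states, recording a one-bit inversion flag in place of heavier ECC/parity.

```python
# Illustrative sketch only: the error-prone state set, the 2-bit cell
# mapping, and the per-weight flag are assumptions, not the paper's design.

def mlc_states(weight_byte):
    """Split an 8-bit weight into four 2-bit MLC cell states (MSB first)."""
    return [(weight_byte >> shift) & 0b11 for shift in (6, 4, 2, 0)]

# Assumption: the two highest threshold-voltage states drift most often.
ERROR_PRONE = {0b10, 0b11}

def encode_with_inversion(weight_byte):
    """Store the weight plain or bit-inverted, whichever yields fewer
    error-prone MLC states; return (stored_byte, inverted_flag)."""
    plain_count = sum(s in ERROR_PRONE for s in mlc_states(weight_byte))
    flipped_byte = weight_byte ^ 0xFF
    flipped_count = sum(s in ERROR_PRONE for s in mlc_states(flipped_byte))
    if flipped_count < plain_count:
        return flipped_byte, True
    return weight_byte, False

def decode(stored_byte, inverted_flag):
    """Recover the original weight from the stored byte and its flag."""
    return stored_byte ^ 0xFF if inverted_flag else stored_byte
```

For example, a weight of `0xFF` maps to four copies of the most vulnerable state, so it would be stored inverted as `0x00` with the flag set; decoding XORs it back. The single flag bit per weight is what keeps the storage overhead below that of a full ECC codeword.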

