
QUANTIZING TRAINED LONG SHORT-TERM MEMORY NEURAL NETWORKS


Abstract

Method for quantizing a trained long short-term memory (LSTM) neural network having a plurality of weights, the method comprising: obtaining data specifying trained floating-point values for each of the weights of the trained LSTM neural network, the trained LSTM neural network comprising one or more LSTM layers, each LSTM layer having a plurality of gates and each of the plurality of gates being associated with an input weight matrix and a recurrent weight matrix; quantizing the trained LSTM neural network, comprising: for each gate, quantizing the elements of the input weight matrix to a target fixed bit-width; for each gate, quantizing the elements of the recurrent weight matrix to the target fixed bit-width; and providing data specifying a quantized LSTM neural network for use in performing quantized inference.
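The claim above quantizes, for each gate of each LSTM layer, the input weight matrix and the recurrent weight matrix to a target fixed bit-width. The patent does not fix a particular quantization scheme; the sketch below illustrates the per-gate, per-matrix structure using symmetric linear quantization as an assumed example (the gate names and function names are illustrative, not from the patent):

```python
import numpy as np

def quantize_matrix(w, bit_width=8):
    """Symmetric linear quantization of one weight matrix to a fixed bit-width.

    Returns integer codes plus the scale needed to dequantize (w ~ codes * scale).
    """
    qmax = 2 ** (bit_width - 1) - 1           # e.g. 127 for 8 bits
    max_abs = float(np.max(np.abs(w)))
    scale = max_abs / qmax if max_abs > 0 else 1.0
    codes = np.clip(np.round(w / scale), -qmax - 1, qmax).astype(np.int32)
    return codes, scale

def quantize_lstm_gates(gates, bit_width=8):
    """Quantize the input and recurrent weight matrices of every gate.

    `gates` maps a gate name to a (input_weights, recurrent_weights) pair,
    mirroring the claim's structure: each gate has both matrices quantized
    to the same target bit-width.
    """
    return {
        name: {
            "input": quantize_matrix(w_in, bit_width),
            "recurrent": quantize_matrix(w_rec, bit_width),
        }
        for name, (w_in, w_rec) in gates.items()
    }

# Usage: quantize all four standard LSTM gates of one layer to 8 bits.
rng = np.random.default_rng(0)
gates = {
    g: (rng.standard_normal((4, 3)), rng.standard_normal((4, 4)))
    for g in ("input", "forget", "cell", "output")
}
quantized = quantize_lstm_gates(gates, bit_width=8)
```

The quantized network is then specified by the integer codes and one scale per matrix, which is the data a quantized-inference runtime would consume.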

Bibliographic data

  • Publication number: WO2020092532A1
  • Publication date: 2020-05-07
  • Original format: PDF
  • Applicant/Assignee: GOOGLE LLC
  • Application number: WO2019US58821
  • Inventor: GUEVARA RAZIEL ALVAREZ
  • Filing date: 2019-10-30
  • Classification: G06N3/04; G06N3/063
  • Country/Region: WO

