
REDUCING DYNAMIC RANGE OF LOW-RANK DECOMPOSITION MATRICES


Abstract

Features are disclosed for reducing the dynamic range of an approximated trained artificial neural network weight matrix in an automatic speech recognition system. The weight matrix may be approximated as two low-rank matrices using a decomposition technique. This approximation technique may insert an additional layer between the two original layers connected by the weight matrix. The dynamic range of the low-rank decomposition may be reduced by taking the square root of the singular values, distributing them across both low-rank matrices, and applying a random rotation matrix to further compress the low-rank matrices. Reducing the dynamic range may make fixed-point scoring more effective due to smaller quantization error, and may make the neural network more amenable to retraining after the weight matrix has been approximated. Features are also disclosed for adjusting the learning rate during retraining to account for the low-rank approximations.
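The decomposition described in the abstract can be illustrated with a short numerical sketch. The following Python/NumPy snippet is an assumption-laden illustration rather than the patented implementation: the function name, the QR-based construction of the random rotation, and the chosen rank are hypothetical, and serve only to show how splitting the square root of the singular values between the two factors and inserting a rotation preserves the product while reducing the magnitude spread of the factor entries.

```python
import numpy as np

def low_rank_reduced_range(W, rank, rng=None):
    """Approximate W as two low-rank factors with reduced dynamic range.

    Sketch of the scheme in the abstract (names and rotation construction
    are assumptions): take a truncated SVD, split the square root of the
    singular values between both factors, then mix the factors with a
    random rotation that cancels in the product.
    """
    rng = np.random.default_rng() if rng is None else rng

    # Truncated SVD: W ~= U_r diag(s_r) V_r^T
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    U_r, s_r, Vt_r = U[:, :rank], s[:rank], Vt[:rank, :]

    # Distribute sqrt of the singular values into both factors so that
    # neither factor carries the full magnitude of the singular values.
    sqrt_s = np.sqrt(s_r)
    A = U_r * sqrt_s             # shape (m, rank)
    B = sqrt_s[:, None] * Vt_r   # shape (rank, n)

    # Random rotation (orthogonal Q from the QR of a Gaussian matrix).
    # It cancels in A @ B but spreads entry magnitudes, further
    # compressing the dynamic range of each factor.
    Q, _ = np.linalg.qr(rng.standard_normal((rank, rank)))
    return A @ Q, Q.T @ B

# Usage sketch: approximate a weight matrix and inspect the result.
W = np.random.default_rng(0).standard_normal((512, 256))
A, B = low_rank_reduced_range(W, rank=64)
rel_err = np.linalg.norm(W - A @ B) / np.linalg.norm(W)
print("relative approximation error:", rel_err)
print("max |entry| in A, B:", np.abs(A).max(), np.abs(B).max())
```

In this sketch the rotation is drawn once at decomposition time; a smaller entry range in the factors would then translate into smaller quantization error under fixed-point scoring, which is the effect the abstract attributes to the technique.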
