...
JMLR: Workshop and Conference Proceedings

Theoretical Properties for Neural Networks with Weight Matrices of Low Displacement Rank



Abstract

Recently, low displacement rank (LDR) matrices, also known as structured matrices, have been proposed to compress large-scale neural networks. Empirical results have shown that neural networks whose weight matrices are LDR matrices, referred to as LDR neural networks, can achieve significant reductions in space and computational complexity while retaining high accuracy. This paper presents a theoretical study of LDR neural networks. First, we prove the universal approximation property of LDR neural networks under a mild condition on the displacement operators. We then show that the error bounds of LDR neural networks are as efficient as those of general neural networks, for both single-layer and multi-layer structures. Finally, we propose a back-propagation-based training algorithm for general LDR neural networks.
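
For context (not part of the paper's abstract), below is a minimal numerical sketch of what "low displacement rank" means, using the standard Sylvester displacement operator ∇_{A,B}(M) = AM − MB with f-unit-circulant shift operators A = Z_1 and B = Z_{−1}: a Toeplitz matrix then has displacement rank at most 2, whereas a generic dense matrix has full displacement rank. The function names and the specific choice of operators are illustrative assumptions, not taken from the paper.

import numpy as np

def unit_circulant(n, f):
    # n x n f-unit-circulant Z_f: ones on the subdiagonal, f in the top-right corner
    Z = np.zeros((n, n))
    Z[1:, :-1] = np.eye(n - 1)
    Z[0, -1] = f
    return Z

def displacement_rank(M, A, B):
    # Rank of the Sylvester displacement A M - M B
    return np.linalg.matrix_rank(A @ M - M @ B)

n = 8
rng = np.random.default_rng(0)

# Toeplitz matrix T[i, j] = t[i - j], built from 2n - 1 generating values
t = rng.standard_normal(2 * n - 1)
T = np.array([[t[i - j + n - 1] for j in range(n)] for i in range(n)])

A, B = unit_circulant(n, 1.0), unit_circulant(n, -1.0)
print(displacement_rank(T, A, B))                            # at most 2: low displacement rank
print(displacement_rank(rng.standard_normal((n, n)), A, B))  # typically n: full rank for a dense matrix

Restricting a weight matrix to such a structured family is what makes the space and computational savings mentioned in the abstract possible: the matrix is determined by O(n) parameters rather than n^2.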


