International Conference on Speech and Computer

Towards Network Simplification for Low-Cost Devices by Removing Synapses



Abstract

Deploying robust neural-network-based models on low-cost devices runs into hardware constraints such as a limited memory footprint and limited computing power. This work presents a general method for rapidly reducing the number of parameters (by 80-90%) in a trained DNN or LSTM network by removing its redundant synapses, without significantly hurting classification accuracy. This massive parameter reduction leads to a notable decrease in model size and in the actual prediction time of on-board classifiers. We demonstrate the pruning results on a simple speech recognition task; however, the method is applicable to any classification data.
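The abstract does not specify the paper's exact criterion for identifying redundant synapses, but the idea of removing 80-90% of parameters from a trained network can be illustrated with a simple magnitude-based pruning sketch: zero out the weights with the smallest absolute values. The function name, the threshold choice, and the use of a plain NumPy weight matrix are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def prune_by_magnitude(weights, sparsity=0.9):
    """Zero out the `sparsity` fraction of weights with the
    smallest magnitude (an assumed, generic pruning criterion)."""
    threshold = np.quantile(np.abs(weights), sparsity)
    mask = np.abs(weights) >= threshold  # True = synapse kept
    return weights * mask, mask

# Example: prune a random 256x256 weight matrix to ~10% density.
rng = np.random.default_rng(0)
W = rng.normal(size=(256, 256))
W_pruned, mask = prune_by_magnitude(W, sparsity=0.9)
print(f"kept {mask.mean():.1%} of synapses")
```

In practice, pruning at this rate is usually followed by a brief fine-tuning pass so the remaining weights can compensate for the removed ones; the sparse mask can then be stored in a compressed format to realize the memory savings on-device.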
