International Conference on Speech and Computer

Towards Network Simplification for Low-Cost Devices by Removing Synapses



Abstract

Deploying robust neural-network-based models on low-cost devices runs into hardware constraints such as a limited memory footprint and limited computing power. This work presents a general method for rapidly reducing the parameters of a trained network (DNN or LSTM) by 80-90% by removing its redundant synapses, without significantly hurting classification accuracy. The massive parameter reduction leads to a notable decrease in model size and in the actual prediction time of on-board classifiers. We show pruning results on a simple speech recognition task; the method, however, is applicable to any classification data.
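The abstract does not specify the pruning criterion, but removing redundant synapses is commonly done by zeroing out the smallest-magnitude weights. The following is a minimal illustrative sketch of such magnitude-based pruning in NumPy; the function name, the 85% sparsity target, and the magnitude criterion are assumptions for illustration, not the authors' exact method.

```python
import numpy as np

def prune_by_magnitude(weights, sparsity=0.85):
    """Zero out the smallest-magnitude entries of a weight matrix so that
    roughly `sparsity` fraction of the synapses are removed.
    Illustrative sketch only; the paper's criterion may differ."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)          # number of synapses to remove
    if k == 0:
        return weights.copy()
    # k-th smallest absolute value serves as the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold     # keep only larger weights
    return weights * mask

# Example: prune a random 256x128 layer to ~85% sparsity
rng = np.random.default_rng(0)
W = rng.normal(size=(256, 128))
W_pruned = prune_by_magnitude(W, sparsity=0.85)
achieved = 1.0 - np.count_nonzero(W_pruned) / W.size
print(round(achieved, 2))  # ≈ 0.85
```

In practice the surviving weights can be stored in a sparse format, which is where the reported reductions in model size and on-device prediction time would come from.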
