The 2010 International Joint Conference on Neural Networks

Compressing a neural network classifier using a Volterra-Neural Network model



Abstract

Model compression is required when a slow, large model is used, for example for classification, but transmission, space, time, or computing-capability constraints must be met. Multilayer Perceptron (MLP) models have traditionally been used as classifiers. Depending on the problem, they may need a large number of parameters (neuron activation functions, weights, and biases) to reach acceptable performance. This work proposes a technique that compresses an MLP model while preserving its classification performance, through the kernels of a Volterra series model. The Volterra kernels can represent the information that a Neural Network (NN) model has learnt with almost the same accuracy, but compressed into fewer parameters. The Volterra-NN approach proposed in this work has two parts. First, it extracts the Volterra kernels from the NN parameters after training; these kernels contain the classifier knowledge. Second, it builds Volterra series models of different orders for the original problem using those kernels, reducing the many neural parameters to a very few Volterra-NN parameters (the kernels). Experimental results on the standard Iris classification problem show the good compression capabilities of the Volterra-NN model.
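The abstract does not give the kernel-extraction procedure, but the compressed model it describes is a truncated Volterra series. The sketch below, a hypothetical illustration rather than the paper's implementation, evaluates a second-order series for one output and compares its parameter count against a small 4-10-1 MLP of the kind one might train on the four Iris features; the kernel values here are random placeholders standing in for kernels extracted from a trained NN.

```python
import numpy as np

def volterra_output(x, h0, h1, h2):
    """Evaluate a truncated second-order Volterra series at input x:
    y = h0 + h1.x + x^T H2 x, where h0, h1, h2 are the Volterra kernels
    of orders zero, one, and two."""
    return h0 + h1 @ x + x @ h2 @ x

# Iris-sized example: 4 input features, placeholder kernel values.
rng = np.random.default_rng(0)
x = rng.normal(size=4)
h0 = 0.1
h1 = rng.normal(size=4)
h2 = rng.normal(size=(4, 4))
h2 = (h2 + h2.T) / 2  # second-order kernel is symmetric

y = volterra_output(x, h0, h1, h2)

# Parameter counts for a single output:
n = 4
mlp_params = n * 10 + 10 + 10 + 1            # 4-10-1 MLP: weights + biases = 61
volterra_params = 1 + n + n * (n + 1) // 2   # h0, h1, unique entries of h2 = 15
```

For four inputs the second-order series needs only 15 parameters per output against 61 for the hypothetical MLP, which illustrates the compression ratio the abstract refers to; the paper's contribution is obtaining kernels from the trained NN so that this smaller model keeps almost the same accuracy.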

