
Experiments with an ensemble self-generating neural network

Abstract

In an earlier paper, we introduced an ensemble model called ESGNN (ensemble self-generating neural network), which can be used to reduce the error in classification and chaotic time series prediction. Although this model achieves higher accuracy than a single SGNN, its computational cost increases in proportion to the number of SGNNs in the ensemble. In this paper, we propose a new pruning algorithm for SGNNs that reduces the memory requirement for classification. We compared ESGNN with a nearest neighbor classifier on a collection of machine-learning benchmarks. Experimental results show that our method reduces the memory requirement while improving accuracy over the nearest neighbor classifier.
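
The abstract does not spell out how an SGNN is constructed or what the pruning rule is, so the Python sketch below is only a hedged illustration of the overall shape of the approach: each ensemble member is approximated as a 1-nearest-prototype classifier grown from a bootstrap sample (standing in for a self-generated network), majority voting combines the members, and a Hart-style condensing step stands in for the paper's pruning of stored units. None of the names or steps below come from the paper itself.

```python
# Minimal sketch, NOT the authors' algorithm. Assumptions (not in the
# abstract): an "SGNN" behaves like a set of stored prototypes, pruning
# is approximated by Hart-style condensing, and members vote by majority.
import numpy as np

def nearest_prototype_predict(protos_X, protos_y, X):
    # Classify each row of X by the label of its nearest prototype.
    d = np.linalg.norm(X[:, None, :] - protos_X[None, :, :], axis=2)
    return protos_y[np.argmin(d, axis=1)]

def condense(X, y, rng):
    # Keep a prototype only when the current set misclassifies it;
    # this stands in for the paper's pruning and shrinks memory use.
    idx = [rng.integers(len(X))]
    changed = True
    while changed:
        changed = False
        for i in rng.permutation(len(X)):
            pred = nearest_prototype_predict(X[idx], y[idx], X[i:i+1])[0]
            if pred != y[i]:
                idx.append(i)
                changed = True
    return X[idx], y[idx]

def ensemble_predict(members, X):
    # Combine the members by majority vote over their predictions.
    votes = np.stack([nearest_prototype_predict(px, py, X)
                      for px, py in members])
    return np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)

rng = np.random.default_rng(0)
# Toy two-class data standing in for a machine-learning benchmark.
X = np.vstack([rng.normal(0, 1, (100, 2)), rng.normal(3, 1, (100, 2))])
y = np.array([0] * 100 + [1] * 100)

members = []
for _ in range(5):  # an ensemble of 5 pruned members
    b = rng.integers(len(X), size=len(X))  # bootstrap sample
    members.append(condense(X[b], y[b], rng))

print("stored prototypes per member:", [len(py) for _, py in members])
print("training accuracy:", (ensemble_predict(members, X) == y).mean())
```

Majority voting is used here only because the abstract does not say how member outputs are combined; the prototype counts printed at the end illustrate the memory-versus-accuracy trade-off the abstract describes.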
