
A novel weighted nearest neighbor ensemble classifier


Abstract

Recent work has shown that combining several classifiers is an effective way to improve classification accuracy. Many ensemble approaches have been introduced, such as bagging and boosting, which reduce the generalization error of many classifiers; however, these methods do not improve the performance of the Nearest Neighbor (NN) classifier. In this paper, a novel weighted ensemble technique (WNNE) is presented for improving the performance of the NN classifier. WNNE is a combination of several NN classifiers, each trained on a different subset of the input feature set. The algorithm assigns a weight to each classifier and uses a weighted vote among these classifiers to determine the output of the ensemble. We evaluated the proposed method on several datasets from the UCI Repository and compared it with the NN classifier and the Random Subspace Method (RSM). The results show that our method outperforms both approaches.
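The scheme the abstract describes — several NN classifiers on different feature subsets, combined by weighted vote — can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm: the feature subsets are drawn at random, the 1-NN rule is used, and each member's weight is taken to be its leave-one-out training accuracy, all of which are assumptions since the abstract does not specify these details.

```python
import numpy as np

class WeightedNNEnsemble:
    """Sketch of a WNNE-style ensemble: several 1-NN classifiers, each on a
    random feature subset, combined by weighted vote. The weighting scheme
    (leave-one-out training accuracy) is an assumption for illustration."""

    def __init__(self, n_members=5, subset_frac=0.6, seed=0):
        self.n_members = n_members
        self.subset_frac = subset_frac
        self.rng = np.random.default_rng(seed)

    def _nn_predict(self, Xtr, ytr, X):
        # 1-NN by squared Euclidean distance
        d = ((X[:, None, :] - Xtr[None, :, :]) ** 2).sum(-1)
        return ytr[d.argmin(axis=1)]

    def fit(self, X, y):
        n, p = X.shape
        k = max(1, int(self.subset_frac * p))
        self.X_, self.y_ = X, y
        self.subsets_, self.weights_ = [], []
        for _ in range(self.n_members):
            feats = self.rng.choice(p, size=k, replace=False)
            # leave-one-out 1-NN accuracy on the training set as the weight
            d = ((X[:, None, feats] - X[None, :, feats]) ** 2).sum(-1)
            np.fill_diagonal(d, np.inf)  # exclude each point's own vote
            acc = (y[d.argmin(axis=1)] == y).mean()
            self.subsets_.append(feats)
            self.weights_.append(acc)
        return self

    def predict(self, X):
        classes = np.unique(self.y_)
        votes = np.zeros((len(X), len(classes)))
        for feats, w in zip(self.subsets_, self.weights_):
            pred = self._nn_predict(self.X_[:, feats], self.y_, X[:, feats])
            for ci, c in enumerate(classes):
                votes[:, ci] += w * (pred == c)
        # weighted majority vote across ensemble members
        return classes[votes.argmax(axis=1)]
```

Like the Random Subspace Method, diversity comes from the feature subsets; the per-classifier weights are what distinguish the weighted vote from RSM's plain majority vote.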

