International Conference on Computing Communication Control and Automation

Efficient Classification with an Improved Nearest Neighbor Algorithm



Abstract

k Nearest Neighbor (kNN) is one of the top ten data mining algorithms because it is easy to understand and delivers strong classification performance. Past kNN methods either use a predefined, fixed k-value for all test samples or select a different ideal k-value for every test sample. However, in real applications, applying kNN with a fixed k-value is unrealistic, while kNN methods that assign a different k-value to every test sample require additional time. Thus, past kNN methods have some limitations. To overcome these issues, in this paper we work on the new kNN methods KTree and K*Tree. The KTree method adds a training stage to past kNN methods in order to quickly learn an ideal k-value for every test sample, so it takes less time than past methods and improves the accuracy of the results. We also work on the K*Tree method, which is even faster than KTree. Further, we implement kNN with soft clustering and compare its results with previous kNN methods.
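For reference, a minimal sketch of the fixed-k kNN baseline the abstract contrasts against, assuming Euclidean distance and majority voting. The function name knn_predict and the toy data are illustrative only; this is not the paper's KTree or K*Tree method, which additionally learns an ideal k-value for each test sample during a training stage.

    # Fixed-k kNN baseline sketch (NumPy only); illustrative, not the paper's KTree/K*Tree.
    import numpy as np
    from collections import Counter

    def knn_predict(X_train, y_train, X_test, k=5):
        """Predict labels for X_test by majority vote among the k nearest
        training samples under Euclidean distance."""
        preds = []
        for x in X_test:
            # Distance from the test point to every training point
            dists = np.linalg.norm(X_train - x, axis=1)
            # Indices of the k closest training samples
            nn_idx = np.argsort(dists)[:k]
            # Majority vote among their labels
            vote = Counter(y_train[nn_idx]).most_common(1)[0][0]
            preds.append(vote)
        return np.array(preds)

    # Toy usage: two Gaussian blobs centered at 0 and 4
    rng = np.random.default_rng(0)
    X_train = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(4, 1, (50, 2))])
    y_train = np.array([0] * 50 + [1] * 50)
    X_test = np.array([[0.5, 0.5], [3.8, 4.2]])
    print(knn_predict(X_train, y_train, X_test, k=5))  # expected: [0 1]

With this baseline, the same k is used for every test point; the abstract's point is that learning a per-sample k (as KTree does) avoids both the rigidity of a fixed k and the extra cost of choosing k separately for each test sample at prediction time.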
