International Journal of Computers & Applications

IMPROVING NAIVE BAYES FOR CLASSIFICATION

Abstract

Naive Bayes (NB) is one of the most widely used algorithms for classification. However, its conditional independence assumption harms its performance to some extent, so many algorithms have been presented to improve its classification accuracy. In this paper, we present another two improved algorithms: instance weighted naive Bayes (IWNB) and combined neighbourhood naive Bayes (CNNB). In IWNB, each training instance is first weighted according to its similarity to the mode of the training instances, and an NB classifier is then built on the weighted training instances. In CNNB, multiple NB classifiers are first built on multiple neighbourhoods of a test instance with different radius values, and their class probability estimates are then averaged to estimate the class probability of the test instance. We experimentally tested IWNB and CNNB on all 36 University of California, Irvine (UCI) data sets selected by Weka and compared them with NB. The experimental results show that both IWNB and CNNB significantly outperform NB.
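
The abstract only outlines the two algorithms, so a minimal Python sketch of how IWNB and CNNB could be realised on categorical data follows. The instance-weighting rule (fraction of attribute values matching the column-wise mode), the Hamming-distance neighbourhoods, and the radius values are illustrative assumptions and need not match the paper's exact definitions.

import numpy as np

def fit_nb(X, y, n_classes, n_values, w=None):
    # Fit an (optionally instance-weighted) naive Bayes model on categorical
    # data encoded as integer codes in {0, ..., n_values - 1}.
    n, d = X.shape
    if w is None:
        w = np.ones(n)
    prior = np.zeros(n_classes)
    cond = np.zeros((n_classes, d, n_values))
    for xi, yi, wi in zip(X, y, w):
        prior[yi] += wi
        cond[yi, np.arange(d), xi] += wi
    prior = (prior + 1.0) / (prior.sum() + n_classes)                     # Laplace smoothing
    cond = (cond + 1.0) / (cond.sum(axis=2, keepdims=True) + n_values)
    return prior, cond

def predict_proba(x, prior, cond):
    # Class probability estimates of a naive Bayes model for one test instance x.
    d = len(x)
    logp = np.log(prior) + np.log(cond[:, np.arange(d), x]).sum(axis=1)
    p = np.exp(logp - logp.max())
    return p / p.sum()

def iwnb_fit(X, y, n_classes, n_values):
    # IWNB sketch: weight each training instance by its similarity to the
    # column-wise mode of the training data (here the fraction of matching
    # attribute values, an assumption), then build NB on the weighted instances.
    d = X.shape[1]
    mode = np.array([np.bincount(X[:, j], minlength=n_values).argmax() for j in range(d)])
    w = ((X == mode).sum(axis=1) + 1.0) / (d + 1.0)
    return fit_nb(X, y, n_classes, n_values, w)

def cnnb_predict_proba(x, X, y, n_classes, n_values, radii=(1, 2, 3)):
    # CNNB sketch: build one NB classifier per neighbourhood of the test
    # instance x (here, training instances within Hamming distance r for each
    # radius r, an assumption) and average their class probability estimates.
    dist = (X != x).sum(axis=1)
    probs = []
    for r in radii:
        mask = dist <= r
        if not mask.any():                      # empty neighbourhood: fall back to all instances
            mask[:] = True
        prior, cond = fit_nb(X[mask], y[mask], n_classes, n_values)
        probs.append(predict_proba(x, prior, cond))
    return np.mean(probs, axis=0)

# Example usage on synthetic categorical data (2 classes, 3 values per attribute):
rng = np.random.default_rng(0)
X_train = rng.integers(0, 3, size=(100, 5))
y_train = rng.integers(0, 2, size=100)
prior, cond = iwnb_fit(X_train, y_train, n_classes=2, n_values=3)
print(predict_proba(X_train[0], prior, cond))
print(cnnb_predict_proba(X_train[0], X_train, y_train, n_classes=2, n_values=3))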
