Published in: Journal of Computer Applications (《计算机应用》)

Progressively Learning Naive Bayes Parameters on Gradually Contracting Spaces


Abstract

Locally Weighted Naive Bayes (LWNB) is a good improvement of Naive Bayes (NB), and Discriminative Frequency Estimate (DFE) remarkably improves the generalization accuracy of NB. Inspired by LWNB and DFE, this paper proposes the Gradually Contracting Spaces (GCS) algorithm for learning the parameters of NB. Given a test instance, GCS finds a series of subspaces within the global space that contains all training instances. These subspaces have two properties: 1) every subspace contains the test instance; 2) each subspace is contained in every subspace larger than it, so the subspaces are nested. GCS then uses the training instances in these gradually contracting subspaces to progressively learn the NB parameters with a modified version of DFE (MDFE), and uses the resulting NB model to classify the test instance. The essential difference from LWNB is that GCS trains NB with all of the training data and can be implemented as an eager (non-lazy) learner. A decision-tree version of GCS, named GCS-T, is implemented in this paper. The experimental results show that GCS-T achieves higher generalization accuracy than C4.5 and several Bayesian classification algorithms such as Naive Bayes, BayesianNet, NBTree, Hidden Naive Bayes (HNB) and LWNB, and that the classification speed of GCS-T is remarkably faster than that of LWNB.
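The procedure described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: it uses Hamming-distance balls around the test instance as the nested subspaces (the paper's GCS-T instead derives subspaces from a decision-tree path), applies a perceptron-style DFE-like update that increments counts by the current prediction error, and is written in a lazy, per-test-instance form for brevity, whereas the paper's GCS is eager.

```python
import numpy as np

class NBCounts:
    """Naive Bayes parameter tables over discrete features, with Laplace smoothing."""
    def __init__(self, n_features, n_classes, n_values):
        self.class_count = np.ones(n_classes)
        self.feat_count = np.ones((n_features, n_values, n_classes))

    def predict_proba(self, x):
        # log P(c) + sum_f log P(x_f | c), normalized to a distribution
        logp = np.log(self.class_count / self.class_count.sum())
        for f, v in enumerate(x):
            col = self.feat_count[f, v]
            logp += np.log(col / col.sum())
        p = np.exp(logp - logp.max())
        return p / p.sum()

    def dfe_update(self, x, y):
        # DFE-style update: add weight proportional to the prediction error
        # on the true class, instead of a plain frequency count of 1.
        w = 1.0 - self.predict_proba(x)[y]
        self.class_count[y] += w
        for f, v in enumerate(x):
            self.feat_count[f, v, y] += w

def gcs_classify(x_test, X, y, n_classes, n_values):
    """Classify x_test by learning NB parameters on gradually contracting subspaces."""
    n, d = X.shape
    nb = NBCounts(d, n_classes, n_values)
    # Hypothetical subspace choice: nested Hamming balls around x_test,
    # from the global space (radius d) down to the smallest ball (radius 0).
    dist = (X != x_test).sum(axis=1)
    for radius in range(d, -1, -1):
        for i in np.where(dist <= radius)[0]:
            nb.dfe_update(X[i], y[i])
    return int(np.argmax(nb.predict_proba(x_test)))
```

Because the subspaces are nested, training instances close to the test instance are updated more often than distant ones, which weights the learned parameters toward the local neighborhood while still using all training data.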

