Learning lazy naive Bayesian classifiers for ranking

Abstract

Naive Bayes (NB) is well known as an effective and efficient classification algorithm. However, it relies on the conditional independence assumption, which is often violated in practice. Moreover, many real-world data mining applications require an accurate ranking of instances rather than an accurate classification. For example, ranking customers by the likelihood that they will buy a product is useful in direct marketing. In this paper, we first investigate the ranking performance of several lazy learning algorithms that extend naive Bayes, where ranking performance is measured by AUC, the area under the ROC curve (Hand and Till, 2001; Bradley, 1997). We observe that they cannot significantly improve naive Bayes' ranking performance. Motivated by this observation, and aiming to improve naive Bayes' ranking accuracy, we present a new lazy learning algorithm, called lazy naive Bayes (LNB), that extends naive Bayes for ranking. We tested our algorithm experimentally on the 36 UCI data sets (Blake and Merz, 2000) recommended by Weka and compared it, in terms of AUC, to NB and C4.4 (Provost and Domingos, 2003). The experimental results show that our algorithm significantly outperforms both NB and C4.4.
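
The abstract does not spell out how LNB localizes naive Bayes, so the sketch below is only an illustrative interpretation of a lazy (instance-specific) naive Bayes ranker: each training instance is weighted by its similarity to the test instance before the NB probabilities are estimated, and the resulting posterior P(class | x) is used as the ranking score, evaluated by AUC as in the abstract. The function and variable names (lazy_nb_posterior, n_values, the toy data) are hypothetical and not taken from the paper.

import numpy as np

def lazy_nb_posterior(X_train, y_train, x, n_values):
    """Class posteriors for one nominal test instance x, estimated lazily.

    Each training instance is weighted by its similarity to x (the fraction
    of matching attribute values), so the counts behind the naive Bayes
    estimates are local to x. Laplace smoothing keeps every probability
    non-zero. Posteriors are returned in the order of np.unique(y_train).
    """
    classes = np.unique(y_train)
    w = (X_train == x).mean(axis=1)          # similarity weight per training instance
    log_post = np.zeros(len(classes))

    for i, c in enumerate(classes):
        in_c = (y_train == c)
        wc, Xc = w[in_c], X_train[in_c]
        prior = (wc.sum() + 1.0) / (w.sum() + len(classes))   # weighted, smoothed class prior
        log_post[i] = np.log(prior)
        for j in range(X_train.shape[1]):                      # weighted, smoothed P(a_j = x_j | c)
            match = wc[Xc[:, j] == x[j]].sum()
            log_post[i] += np.log((match + 1.0) / (wc.sum() + n_values[j]))

    post = np.exp(log_post - log_post.max())
    return post / post.sum()                                   # used as the ranking score

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.integers(0, 3, size=(120, 4))              # toy data: 4 nominal attributes, 3 values each
    y = (X[:, 0] == X[:, 1]).astype(int)               # hypothetical binary target
    X_tr, y_tr, X_te, y_te = X[:90], y[:90], X[90:], y[90:]

    scores = np.array([lazy_nb_posterior(X_tr, y_tr, x, [3, 3, 3, 3])[1] for x in X_te])

    # AUC as the probability that a positive instance outranks a negative one
    # (Mann-Whitney form; assumes both classes occur in the held-out set)
    pos, neg = scores[y_te == 1], scores[y_te == 0]
    auc = (pos[:, None] > neg[None, :]).mean() + 0.5 * (pos[:, None] == neg[None, :]).mean()
    print(f"AUC on the held-out instances: {auc:.3f}")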
