
Lazy Averaged One-Dependence Estimators


Abstract

Naive Bayes is a probability-based classification model built on the conditional independence assumption. In many real-world applications, however, this assumption is often violated. Responding to this fact, researchers have made a substantial amount of effort to improve the accuracy of naive Bayes by weakening the conditional independence assumption. The most recent such work is Averaged One-Dependence Estimators (AODE), which demonstrates good classification performance. In this paper, we propose a novel lazy learning algorithm, Lazy Averaged One-Dependence Estimators (LAODE for short), by extending AODE. For a given test instance, LAODE first expands the training data by adding some copies (clones) of each training instance according to its similarity to the test instance, and then uses the expanded training data to build an AODE classifier to classify the test instance. We experimentally test our algorithm in the Weka system, using all 36 UCI data sets recommended by Weka, and compare it to naive Bayes, AODE, and LBR. The experimental results show that LAODE significantly outperforms all the other compared algorithms.
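The abstract does not specify the similarity measure or the exact cloning rule, so the following is only a minimal sketch of the lazy expansion step under assumed choices: similarity is taken as the number of matching nominal attribute values, each training instance receives that many extra clones, and `aode_factory` stands in for a hypothetical AODE implementation (e.g. a wrapper around Weka's AODE); none of these names come from the paper.

```python
# Sketch of the LAODE lazy classification step (assumptions noted in comments).

def classify_laode(test_x, train_X, train_y, aode_factory):
    """Classify one test instance with a lazily built AODE model.

    test_x       : list of nominal attribute values for the test instance
    train_X      : list of training instances (each a list of nominal values)
    train_y      : list of class labels
    aode_factory : callable returning an untrained AODE-style classifier
                   (hypothetical; not part of the paper or any named library)
    """
    expanded_X, expanded_y = [], []
    for x, y in zip(train_X, train_y):
        # Assumed similarity: number of attribute values shared with the test
        # instance. The paper's actual measure may differ.
        similarity = sum(1 for a, b in zip(x, test_x) if a == b)
        # Keep the original instance plus `similarity` clones of it, so more
        # similar instances carry more weight in the expanded training data.
        copies = 1 + similarity
        expanded_X.extend([x] * copies)
        expanded_y.extend([y] * copies)

    # Build an AODE classifier on the expanded training data and use it to
    # predict the label of this single test instance.
    model = aode_factory()
    model.fit(expanded_X, expanded_y)
    return model.predict([test_x])[0]
```

Because the expanded data set is specific to each test instance, the model is rebuilt per query, which is the defining trait of a lazy learner and the main source of LAODE's extra cost relative to eager AODE.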
