
Building an Ensemble of Fine-Tuned Naive Bayesian Classifiers for Text Classification



Abstract

Text classification is one domain in which the naive Bayesian (NB) learning algorithm performs remarkably well. However, further improving its performance with ensemble-building techniques has proved challenging because NB is a stable algorithm. This work shows that, while an ensemble of NB classifiers achieves little or no improvement in classification accuracy, an ensemble of fine-tuned NB classifiers can achieve a remarkable improvement. We propose a fine-tuning algorithm for text classification that is both more accurate and less stable than the NB algorithm and the fine-tuning NB (FTNB) algorithm. This reduced stability makes it more suitable than the FTNB algorithm for building ensembles of classifiers using bagging. Our empirical experiments on 16 benchmark text-classification data sets show significant improvements for most of them.
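The abstract's core idea is that bagging helps little for a stable learner such as plain NB, but pays off for less stable fine-tuned variants. A minimal sketch of the bagging side of that setup is below; the `MultinomialNB` and `bagging_predict` names are illustrative stand-ins written for this note, and the paper's FTNB fine-tuning step is not reproduced. Each base classifier is trained on a bootstrap resample of the corpus and predictions are combined by majority vote.

```python
import random
from collections import Counter
from math import log

class MultinomialNB:
    """Minimal multinomial naive Bayes over bag-of-words documents."""

    def fit(self, docs, labels):
        self.classes = sorted(set(labels))
        n = len(labels)
        self.prior = {c: log(labels.count(c) / n) for c in self.classes}
        self.word_counts = {c: Counter() for c in self.classes}
        self.vocab = set()
        for doc, y in zip(docs, labels):
            self.word_counts[y].update(doc)
            self.vocab.update(doc)
        self.total = {c: sum(self.word_counts[c].values()) for c in self.classes}
        return self

    def predict(self, doc):
        V = len(self.vocab)

        def score(c):
            s = self.prior[c]
            for w in doc:
                # Laplace smoothing so unseen words do not zero out a class.
                s += log((self.word_counts[c][w] + 1) / (self.total[c] + V))
            return s

        return max(self.classes, key=score)

def bagging_predict(docs, labels, test_doc, n_estimators=5, seed=0):
    """Train n_estimators NB classifiers on bootstrap resamples; majority vote."""
    rng = random.Random(seed)
    votes = Counter()
    n = len(docs)
    for _ in range(n_estimators):
        idx = [rng.randrange(n) for _ in range(n)]
        clf = MultinomialNB().fit([docs[i] for i in idx], [labels[i] for i in idx])
        votes[clf.predict(test_doc)] += 1
    return votes.most_common(1)[0][0]

# Toy corpus: documents are token lists.
docs = [["ball", "goal"], ["goal", "score"], ["vote", "election"], ["election", "policy"]]
labels = ["sport", "sport", "politics", "politics"]
print(bagging_predict(docs, labels, ["goal", "ball"], n_estimators=7))
```

Because plain NB is stable, the bootstrap resamples here tend to produce near-identical classifiers, so the vote rarely differs from a single NB model; the paper's point is that an unstable fine-tuned base learner gives bagging real diversity to exploit.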

