Parameter Estimation of One-Class SVM on Imbalance Text Classification

Abstract

Compared with conventional two-class learning schemes, one-class classification uses only a single class for training. Applying one-class classification to the minority class in an imbalanced dataset has been shown to achieve better performance than two-class schemes. In this paper, in order to make the best use of all the available information during the learning procedure, we propose a general framework which first uses the minority class for training in the one-class classification stage, and then uses both the minority and majority classes for estimating the generalization performance of the constructed classifier. Based on this generalization performance measure, a parameter search algorithm selects the best parameter settings for the classifier. Experiments on UCI and Reuters text data show that a one-class SVM embedded in this framework achieves much better performance than the standard one-class SVM alone and than other learning schemes such as one-class Naive Bayes, one-class nearest neighbour, and neural networks.
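The framework described in the abstract (train a one-class SVM on the minority class only, then pick its parameters by scoring against both classes) can be sketched as follows. This is a minimal illustration using scikit-learn's `OneClassSVM`; the synthetic data, the `(nu, gamma)` grid, and the use of F1 as the generalization measure are assumptions for demonstration, not the paper's exact experimental setup.

```python
import numpy as np
from sklearn.svm import OneClassSVM
from sklearn.metrics import f1_score

rng = np.random.default_rng(0)
# Illustrative imbalanced data: a small minority (target) cluster and a
# much larger majority cluster. Real experiments would use text features.
X_min = rng.normal(0.0, 0.5, size=(30, 2))    # minority class
X_maj = rng.normal(3.0, 1.0, size=(300, 2))   # majority class

def evaluate(nu, gamma):
    """Stage 1: fit on the minority class only.
    Stage 2: estimate generalization on BOTH classes (here via F1)."""
    clf = OneClassSVM(nu=nu, gamma=gamma).fit(X_min)
    X_all = np.vstack([X_min, X_maj])
    # OneClassSVM predicts +1 for inliers (minority), -1 for outliers.
    y_true = np.r_[np.ones(len(X_min)), -np.ones(len(X_maj))]
    return f1_score(y_true, clf.predict(X_all), pos_label=1)

# Parameter search: choose the (nu, gamma) pair with the best
# two-class generalization estimate.
grid = [(nu, g) for nu in (0.05, 0.1, 0.2) for g in (0.1, 1.0, 10.0)]
best = max(grid, key=lambda p: evaluate(*p))
print("best (nu, gamma):", best, "F1:", round(evaluate(*best), 3))
```

The key point is that the majority class never enters training; it is used only inside `evaluate` to score candidate parameter settings, which is what lets the framework exploit all available information.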

