Conference: Canadian Society for Computational Studies of Intelligence Conference

Parameter Estimation of One-Class SVM on Imbalance Text Classification



Abstract

Compared with conventional two-class learning schemes, one-class classification uses only a single class for training. Applying one-class classification to the minority class of an imbalanced data set has been shown to achieve better performance than the two-class approach. In this paper, in order to make the best use of all the available information during the learning procedure, we propose a general framework which first uses the minority class for training in the one-class classification stage, and then uses both the minority and majority classes to estimate the generalization performance of the constructed classifier. Based upon this generalization performance measurement, a parameter-search algorithm selects the best parameter settings for the classifier. Experiments on UCI and Reuters text data show that a one-class SVM embedded in this framework achieves much better performance than the standard one-class SVM alone and than other learning schemes, such as one-class Naive Bayes, one-class nearest neighbour, and neural networks.
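The two-stage framework described above can be sketched in a few lines with scikit-learn's `OneClassSVM`. This is a minimal illustration, not the authors' implementation: the synthetic data, the `(nu, gamma)` grid, and the use of F1 as the generalization-performance measure on the mixed validation set are all assumptions made for the example.

```python
import numpy as np
from sklearn.svm import OneClassSVM
from sklearn.metrics import f1_score

rng = np.random.default_rng(0)
# Hypothetical imbalanced data: small minority class (label 1), large majority (label 0).
X_min = rng.normal(loc=0.0, scale=1.0, size=(60, 2))
X_maj = rng.normal(loc=4.0, scale=1.0, size=(600, 2))

# Stage 1: train the one-class SVM on the minority class only.
X_train = X_min[:40]

# Stage 2: estimate generalization performance on held-out data from BOTH classes.
X_val = np.vstack([X_min[40:], X_maj])
y_val = np.array([1] * 20 + [0] * 600)

best_score, best_params = -1.0, None
for nu in (0.05, 0.1, 0.2, 0.5):            # illustrative parameter grid
    for gamma in (0.01, 0.1, 1.0):
        clf = OneClassSVM(kernel="rbf", nu=nu, gamma=gamma).fit(X_train)
        # OneClassSVM predicts +1 for inliers (minority) and -1 for outliers.
        pred = (clf.predict(X_val) == 1).astype(int)
        score = f1_score(y_val, pred)       # assumed performance measure
        if score > best_score:
            best_score, best_params = score, (nu, gamma)

print(best_params, round(best_score, 3))
```

The key point the framework makes is visible in the split: the classifier itself never sees majority-class examples during fitting, but the parameter search does, which is how the otherwise-discarded majority information is put to use.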


