
Multi-Class Imbalance Learning Algorithm: EasyEnsemble.M


Abstract

Random under-sampling discards potentially useful information in the majority class, and the problem becomes even more pronounced in multi-class imbalance learning. This paper proposes EasyEnsemble.M, an algorithm for the multi-class imbalance problem. By randomly sampling the majority classes multiple times, the algorithm exploits the majority-class examples that plain under-sampling would ignore, learns a sub-classifier on each sample, and combines the sub-classifiers into a stronger classifier with hybrid ensemble techniques. Experimental results show that EasyEnsemble.M outperforms other commonly used multi-class imbalance learning methods when G-mean is used as the performance measure.
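The abstract outlines the core recipe: repeatedly under-sample the majority classes, train a sub-classifier on each balanced subset, and fuse the sub-classifiers into one strong classifier evaluated by G-mean. The sketch below is a minimal illustration of that recipe, not the paper's exact EasyEnsemble.M: the class name EasyEnsembleMultiClass, the choice of AdaBoost as base learner, the per-class resampling ratio, the probability-averaging fusion rule, and the g_mean helper are all assumptions made for illustration.

```python
# Illustrative sketch of an EasyEnsemble-style learner for multi-class imbalance.
# Assumes X is a NumPy feature matrix and y a NumPy label vector.
import numpy as np
from sklearn.ensemble import AdaBoostClassifier


class EasyEnsembleMultiClass:
    def __init__(self, n_subsets=10, random_state=0):
        self.n_subsets = n_subsets        # number of random under-sampled subsets
        self.random_state = random_state
        self.models_ = []

    def fit(self, X, y):
        rng = np.random.RandomState(self.random_state)
        classes, counts = np.unique(y, return_counts=True)
        n_min = counts.min()              # size of the smallest class
        self.classes_ = classes
        self.models_ = []
        for _ in range(self.n_subsets):
            # Under-sample every class down to the minority-class size, drawing
            # a fresh random subset each round so that different majority-class
            # examples are seen across the ensemble.
            idx = np.concatenate([
                rng.choice(np.where(y == c)[0], size=n_min, replace=False)
                for c in classes
            ])
            clf = AdaBoostClassifier(n_estimators=10, random_state=self.random_state)
            clf.fit(X[idx], y[idx])
            self.models_.append(clf)
        return self

    def predict(self, X):
        # Fuse sub-classifiers by averaging their class-probability estimates
        # (an assumed combination rule, stood in for the paper's hybrid ensemble).
        proba = np.mean([m.predict_proba(X) for m in self.models_], axis=0)
        return self.classes_[np.argmax(proba, axis=1)]


def g_mean(y_true, y_pred, classes):
    # Geometric mean of per-class recalls, the evaluation measure named in the abstract.
    recalls = [np.mean(y_pred[y_true == c] == c) for c in classes]
    return float(np.prod(recalls) ** (1.0 / len(recalls)))
```

With scikit-learn installed, a model built as EasyEnsembleMultiClass(n_subsets=10).fit(X_train, y_train) can then be scored with g_mean(y_test, model.predict(X_test), model.classes_).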
