
MUTE: Majority under-sampling technique

Abstract

An application that operates on an imbalanced dataset loses classification performance on the minority class, which is rare and important. There are a number of over-sampling techniques that insert minority instances into a dataset to adjust the class distribution. Unfortunately, these added instances substantially increase the computation required to build a classifier. In this paper, a new simple and effective under-sampling technique called MUTE is proposed. Its strategy is to get rid of noisy majority instances that overlap with minority instances. The majority instances to be removed are selected based on their safe levels, relying on the Safe-Level-SMOTE concept. MUTE not only reduces classifier construction time because of the downsized dataset but also improves the prediction rate on the minority class. The experimental results show that MUTE improves F-measure compared to SMOTE techniques.
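As a rough illustration of the strategy described in the abstract (not the authors' implementation), the sketch below assumes that the safe level of a majority instance is the number of majority-class neighbours among its k nearest neighbours, in the spirit of Safe-Level-SMOTE, and that majority instances whose safe level falls below a threshold are treated as overlapping noise and discarded. The function name mute_undersample and the parameters k and min_safe_level are hypothetical choices for this sketch.

```python
# Hypothetical sketch of MUTE-style majority under-sampling (not the paper's code).
# Assumption: a majority instance's "safe level" = number of majority-class points
# among its k nearest neighbours; a low safe level marks it as overlapping noise.
import numpy as np
from sklearn.neighbors import NearestNeighbors

def mute_undersample(X, y, majority_label, k=5, min_safe_level=3):
    X, y = np.asarray(X), np.asarray(y)
    # k + 1 neighbours because each point is returned as its own nearest neighbour
    nn = NearestNeighbors(n_neighbors=k + 1).fit(X)
    _, idx = nn.kneighbors(X)
    keep = np.ones(len(y), dtype=bool)
    for i in np.where(y == majority_label)[0]:
        neighbours = idx[i, 1:]  # drop the point itself
        safe_level = np.sum(y[neighbours] == majority_label)
        if safe_level < min_safe_level:
            keep[i] = False      # overlapping/noisy majority instance: remove
    return X[keep], y[keep]

# Example usage (X_train, y_train assumed to be an imbalanced training set):
# X_res, y_res = mute_undersample(X_train, y_train, majority_label=0)
```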
