International Joint Conference on Neural Networks (IJCNN)

Minority Oversampling Using Sensitivity


Abstract

The Synthetic Minority Oversampling Technique (SMOTE) is effective for handling imbalanced classification problems. However, SMOTE's random candidate selection may lead to severe overlap between classes and introduce new noise. Many variants of SMOTE have been proposed to alleviate these problems by generating new examples in safe regions. Most of these methods generate new examples from existing minority examples without considering the negative impact that class imbalance has on those examples. In this paper, we handle imbalanced classification using Bayes' decision rule and propose a novel oversampling method, Minority Oversampling using Sensitivity (MOSS). Candidates for new example generation are selected according to their sensitivity to class imbalance. New examples are then generated by interpolating between a candidate and one of its adjacent examples. Experiments on 30 datasets confirm the superiority of MOSS over one baseline method and seven oversampling methods.
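The abstract outlines a SMOTE-style generation step: pick a candidate minority example according to its sensitivity to class imbalance, then interpolate between the candidate and one of its adjacent examples. Below is a minimal Python sketch of that loop. The sensitivity measure derived from Bayes' decision rule is the paper's contribution and is not specified in the abstract, so the `sensitivity_scores` argument here is a hypothetical placeholder supplied by the caller; the neighbor search and interpolation follow the usual SMOTE recipe.

```python
# Minimal sketch, not the authors' implementation. Assumes `sensitivity_scores`
# (standing in for the MOSS criterion, not given in the abstract) is a
# non-negative array ranking how strongly each minority example is affected
# by class imbalance.
import numpy as np
from sklearn.neighbors import NearestNeighbors

def oversample_with_sensitivity(X_min, sensitivity_scores, n_new, k=5, seed=None):
    """Generate n_new synthetic minority examples by SMOTE-style interpolation,
    preferring candidates with higher sensitivity scores."""
    rng = np.random.default_rng(seed)
    probs = sensitivity_scores / sensitivity_scores.sum()   # sampling weights

    # k nearest minority neighbors of each minority example
    # (column 0 of idx is the point itself, so it is skipped below)
    nn = NearestNeighbors(n_neighbors=k + 1).fit(X_min)
    _, idx = nn.kneighbors(X_min)

    synthetic = []
    for _ in range(n_new):
        c = rng.choice(len(X_min), p=probs)              # sensitivity-weighted candidate
        neighbor = X_min[rng.choice(idx[c][1:])]         # one adjacent minority example
        gap = rng.random()                               # interpolation coefficient in [0, 1)
        synthetic.append(X_min[c] + gap * (neighbor - X_min[c]))
    return np.vstack(synthetic)
```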
