Engineering Applications of Artificial Intelligence

UnderBagging based reduced Kernelized weighted extreme learning machine for class imbalance learning

Abstract

Extreme learning machine (ELM) is one of the most capable and fast real-valued classification algorithms, with good generalization performance. Conventional ELM does not handle the class imbalance problem effectively. Numerous variants of ELM, such as weighted ELM (WELM) and Boosting WELM (BWELM), have been proposed to diminish the performance degradation caused by class imbalance. This work proposes a novel Reduced Kernelized WELM (RKWELM), a variant of kernelized WELM that handles the class imbalance problem more effectively. The performance of RKWELM varies with the arbitrary selection of the kernel centroids; to reduce this variation, this work uses an ensemble method. The computational complexity of kernelized ELM (KELM) depends on the number of kernels. KELM generally employs a Gaussian kernel function and uses all of the training instances as centroids, which requires computing the pseudoinverse of an N × N matrix, where N is the number of training instances. This operation becomes very slow for large values of N. Moreover, for imbalanced classification problems, using all the training instances as centroids produces more centroids representing the majority class than the minority class, which can bias the classification model in favor of the majority class instances. Therefore, this work uses a subset of the training instances as the kernel centroids. RKWELM arbitrarily chooses N_min instances from each class to act as centroids, so the total number of centroids is Ñ = m × N_min, where m is the number of classes and N_min is the number of instances in the minority class, i.e. the class with the fewest instances. This reduction in the number of kernels yields a reduced kernel matrix of size Ñ × Ñ, decreasing the computational complexity. This work creates a number of balanced kernel subsets depending on the degree of class imbalance, and a number of RKWELM-based classification models are built from these balanced kernel subsets. The final outcome is computed by majority voting and by soft voting over these classification models. The proposed algorithm is assessed on benchmark real-world imbalanced datasets downloaded from the KEEL dataset repository. The experimental results indicate the superiority of the proposed work over the other classifiers on imbalanced classification problems.
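To make the procedure concrete, the following is a minimal sketch of the reduced kernelized weighted ELM ensemble described above, assuming a Gaussian kernel, the common per-class weighting w_i = 1 / (number of instances in the class of i), and the reduced-kernel solution β = (I/C + KᵀWK)⁻¹KᵀWT. Names such as RKWELMEnsemble, gamma, and n_models are illustrative, not the authors' code.

```python
import numpy as np

def gaussian_kernel(X, centroids, gamma=1.0):
    """K[i, j] = exp(-gamma * ||x_i - c_j||^2): an N x N_tilde reduced kernel matrix."""
    d2 = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

class RKWELMEnsemble:
    def __init__(self, n_models=10, C=1.0, gamma=1.0, voting="soft", seed=0):
        self.n_models, self.C, self.gamma, self.voting = n_models, C, gamma, voting
        self.rng = np.random.default_rng(seed)
        self.models = []                                      # list of (centroids, beta)

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        n_min = min((y == c).sum() for c in self.classes_)    # size of the minority class
        # One-hot targets and per-instance weights w_i = 1 / (#instances of class of i)
        T = (y[:, None] == self.classes_[None, :]).astype(float)
        counts = T.sum(axis=0)
        w = 1.0 / counts[np.argmax(T, axis=1)]
        for _ in range(self.n_models):
            # Arbitrarily pick N_min instances from every class as kernel centroids,
            # giving N_tilde = m * N_min centroids per model (a balanced kernel subset).
            idx = np.concatenate([self.rng.choice(np.where(y == c)[0], n_min, replace=False)
                                  for c in self.classes_])
            centroids = X[idx]
            K = gaussian_kernel(X, centroids, self.gamma)     # N x N_tilde
            KtW = K.T * w                                     # K^T diag(w)
            A = np.eye(K.shape[1]) / self.C + KtW @ K         # only an N_tilde x N_tilde system
            beta = np.linalg.solve(A, KtW @ T)
            self.models.append((centroids, beta))
        return self

    def predict(self, X):
        outputs = np.stack([gaussian_kernel(X, c, self.gamma) @ b for c, b in self.models])
        if self.voting == "soft":                             # average the raw ELM outputs
            return self.classes_[np.argmax(outputs.mean(axis=0), axis=1)]
        votes = np.argmax(outputs, axis=2)                    # majority vote over models
        maj = np.apply_along_axis(
            lambda v: np.bincount(v, minlength=len(self.classes_)).argmax(), 0, votes)
        return self.classes_[maj]
```

Under these assumptions, each base model only inverts an Ñ × Ñ system instead of the full N × N kernel matrix, and the arbitrariness of the centroid selection is averaged out by the soft or majority vote across the ensemble.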