Pattern Recognition: The Journal of the Pattern Recognition Society

A closed-form reduction of multi-class cost-sensitive learning to weighted multi-class learning



Abstract

In cost-sensitive learning, misclassification costs can vary across classes. This paper investigates an approach that reduces multi-class cost-sensitive learning to a standard classification task, based on the data space expansion technique developed by Abe et al.; for binary classification tasks, the reduction coincides with Elkan's. Under this reduction, a cost-sensitive learning problem can be solved as a standard 0/1-loss classification problem over a new distribution determined by the cost matrix. We also propose a new weighting mechanism for solving the reduced classification problem, based on a theorem stating that the empirical loss on independent and identically distributed samples from the new distribution is essentially the same as the loss on the expanded weighted training set. Experimental results on several synthetic and benchmark datasets show that our weighting approach is more effective than existing representative approaches for cost-sensitive learning.
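The data-space expansion the abstract refers to can be sketched as follows: each training example (x, y) is replicated once per candidate label k and given a weight derived from the cost matrix, so that minimizing weighted 0/1 loss on the expanded set approximates minimizing expected cost. The specific weight used below, w = max_j C[y, j] − C[y, k], is a plausible closed form chosen for illustration; the paper derives its own weighting, which may differ.

```python
import numpy as np

def expand_dataset(X, y, C):
    """Expand (X, y) into a weighted multi-class training set.

    C[t, p] is the cost of predicting class p when the true class is t.
    Each example i is copied once per label k with (hypothetical) weight
    w = C[y_i].max() - C[y_i, k]; zero-weight copies are dropped, since
    they contribute nothing to the weighted 0/1 loss.
    """
    Xe, ye, we = [], [], []
    K = C.shape[0]
    for i in range(len(y)):
        row = C[y[i]]
        for k in range(K):
            w = row.max() - row[k]
            if w > 0:
                Xe.append(X[i])
                ye.append(k)
                we.append(w)
    return np.array(Xe), np.array(ye), np.array(we)

# Toy 3-class cost matrix (rows: true class, columns: predicted class).
C = np.array([[0., 1., 4.],
              [2., 0., 1.],
              [5., 1., 0.]])
X = np.array([[0.1], [0.9], [0.5]])
y = np.array([0, 1, 2])
Xe, ye, we = expand_dataset(X, y, C)
```

The expanded set can then be fed to any learner that accepts per-example weights (e.g. a `sample_weight` argument), which is the "weighted multi-class learning" side of the reduction.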
