
Softmax Cross Entropy Loss with Unbiased Decision Boundary for Image Classification

Chinese Automation Congress

Abstract

In a neural network trained with the softmax cross entropy loss, the output probability is computed from a linear combination of the parameter vector of each class in the last layer and the hidden features of the sample points. The final output of the network is therefore affected by the L2-norm of each class's parameter vector. Taking binary classification as an example, if the parameter vector of one class has a large L2-norm, the decision boundary lies close to the class with the smaller L2-norm, so sample points are easily assigned to the class with the large L2-norm. Based on this observation, this paper proposes a new softmax cross entropy loss that adjusts the position of the decision boundary so that it is not biased toward any class. Experimental results on the LabelMe dataset and the UIUC-Sports dataset show that the proposed loss is superior to the softmax cross entropy loss.
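To make the norm-induced bias concrete: with logits z_c = w_c^T h + b_c, the boundary between classes i and j satisfies (w_i - w_j)^T h = b_j - b_i, so a larger ||w_i||_2 pulls the boundary toward class j and points are more easily assigned to class i. The sketch below is only an assumption about how such a bias can be removed (by L2-normalizing each class's weight vector before the standard softmax cross entropy); it is not the paper's exact formulation, and the class name NormalizedSoftmaxCrossEntropy and the scale parameter are illustrative.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class NormalizedSoftmaxCrossEntropy(nn.Module):
    """Softmax cross entropy on logits whose per-class weight vectors are
    L2-normalized, so no class gains an advantage from a larger parameter
    norm. Illustrative sketch only; the paper's boundary adjustment may differ."""

    def __init__(self, feat_dim: int, num_classes: int, scale: float = 10.0):
        super().__init__()
        # One parameter vector per class, as in the last linear layer.
        self.weight = nn.Parameter(torch.randn(num_classes, feat_dim) * 0.01)
        # A temperature/scale is commonly needed after normalization.
        self.scale = scale

    def forward(self, features: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
        # Removing the per-class norm makes the boundary between any two
        # classes depend only on the directions of their weight vectors.
        w = F.normalize(self.weight, p=2, dim=1)
        logits = self.scale * features @ w.t()
        return F.cross_entropy(logits, labels)

# Usage: hidden features from the penultimate layer and integer class labels.
feats = torch.randn(8, 128)
labels = torch.randint(0, 10, (8,))
loss_fn = NormalizedSoftmaxCrossEntropy(feat_dim=128, num_classes=10)
loss = loss_fn(feats, labels)
```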
