Chinese Automation Congress

Softmax Cross Entropy Loss with Unbiased Decision Boundary for Image Classification



Abstract

In a neural network trained with the softmax cross entropy loss, the output probability is computed mainly from the linear combination of the parameter vector of each class in the last layer with the hidden features of the sample points. The final output of the network is therefore affected by the L2-norm of each class's parameter vector. Taking binary classification as an example, if the parameter vector of one class has a large L2-norm, the decision boundary lies close to the class with the smaller L2-norm, so sample points are easily assigned to the class with the large L2-norm. Based on this observation, this paper proposes a new softmax cross entropy loss that adjusts the position of the decision boundary so that it is not biased toward any class. Experimental results on the LabelMe dataset and the UIUC-Sports dataset show that the proposed loss is superior to the standard softmax cross entropy loss.
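To make the geometric claim concrete, the following is a minimal NumPy sketch with hypothetical weight vectors, not the paper's own formulation: two classes whose last-layer parameter vectors are symmetric about the x-axis but have different L2-norms. The resulting decision boundary sits far from the angular bisector, on the side of the small-norm class, so a feature exactly on the bisector is confidently assigned to the large-norm class. The final step normalizes the weights as one simple remedy for comparison; the paper's proposed loss may adjust the boundary differently.

import numpy as np

# Hypothetical two-class example: parameter vectors point 45 degrees above
# and below the x-axis, biases are zero, but the L2-norms differ.
w1 = 5.0 * np.array([np.cos(np.pi / 4),  np.sin(np.pi / 4)])   # class 0, ||w|| = 5
w2 = 1.0 * np.array([np.cos(-np.pi / 4), np.sin(-np.pi / 4)])  # class 1, ||w|| = 1
W = np.stack([w1, w2])                                          # last-layer weights

def softmax_probs(x, W):
    """Softmax over the logits W @ x (biases omitted for simplicity)."""
    logits = W @ x
    e = np.exp(logits - logits.max())
    return e / e.sum()

# A feature on the angular bisector of the two parameter vectors: an
# unbiased boundary would give it probability 0.5 for each class.
x_bisector = np.array([1.0, 0.0])
print(softmax_probs(x_bisector, W))       # heavily favours class 0 (large norm)

# Sweep unit feature vectors by angle to locate the actual boundary:
# it sits near -33.7 degrees, much closer to the small-norm class at -45.
angles = np.linspace(-np.pi / 4, np.pi / 4, 2001)
feats = np.stack([np.cos(angles), np.sin(angles)], axis=1)
preds = (feats @ W.T).argmax(axis=1)
boundary = angles[np.argmax(preds == 0)]  # first angle assigned to class 0
print(np.degrees(boundary))

# Normalising the parameter vectors (one common remedy, shown only for
# comparison) moves the boundary back to the bisector.
W_unit = W / np.linalg.norm(W, axis=1, keepdims=True)
print(softmax_probs(x_bisector, W_unit))  # approximately [0.5, 0.5]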
