IEEE International Symposium on Information Theory

Adaptive Label Smoothing for Classifier-based Mutual Information Neural Estimation



Abstract

Estimating the mutual information (MI) by neural networks has achieved significant practical success, especially in representation learning. Recent results further reduced the variance of the neural estimate by training a probabilistic classifier. However, the trained classifier tends to be overly confident about some of its predictions, which results in an overestimated MI that fails to capture the desired representation. To soften the classifier, we propose a novel scheme that smooths the labels adaptively according to how extreme the probability estimates are. The resulting MI estimate is unbiased under a mild assumption on the model. Experimental results on the MNIST and CIFAR10 datasets confirm that our method yields better representations and achieves higher classification test accuracy than existing approaches to self-supervised representation learning.
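The abstract's two ingredients can be sketched briefly. In classifier-based MI estimation, a classifier is trained to distinguish samples from the joint distribution (label 1) from samples of the product of marginals (label 0); its log-odds then estimate the pointwise log density ratio, whose mean under the joint gives the MI. The sketch below illustrates this, together with a *hypothetical* adaptive smoothing schedule that pulls labels toward 1/2 more strongly when the classifier's probability estimate is extreme — the exact schedule and unbiasedness conditions are given in the paper, not here.

```python
import numpy as np

def adaptive_smooth(targets, probs, alpha_max=0.2):
    """Smooth 0/1 labels toward 1/2, more so for extreme predictions.

    `alpha_max` and the linear extremeness schedule are illustrative
    assumptions, not the paper's exact scheme.
    """
    extremeness = 2.0 * np.abs(probs - 0.5)       # 0 at p=0.5, 1 at p in {0,1}
    alpha = alpha_max * extremeness               # per-sample smoothing amount
    return targets * (1.0 - alpha) + 0.5 * alpha  # convex pull toward 1/2

def mi_estimate(probs_joint):
    """Density-ratio MI estimate from classifier probabilities.

    If p = P(joint | x, y), then log(p / (1 - p)) estimates the pointwise
    log density ratio log dP(x,y) / d(P(x) x P(y)); averaging it over
    joint samples estimates the mutual information (in nats).
    """
    eps = 1e-7
    p = np.clip(probs_joint, eps, 1.0 - eps)      # guard against log(0)
    return float(np.mean(np.log(p) - np.log1p(-p)))
```

For example, a confident prediction `p = 1.0` with `alpha_max = 0.2` maps a label of 1 to a smoothed target of 0.9, while a maximally uncertain prediction `p = 0.5` leaves the label untouched; and a classifier that always outputs 0.5 (no discrimination between joint and marginals) yields an MI estimate of 0.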


