Conference: Neural and Stochastic Methods in Image and Signal Processing

Asymptotic improvement of supervised learning by utilizing additional unlabeled samples: normal mixture density case




Abstract: The effect of additional unlabeled samples in improving the supervised learning process is studied in this paper. Three learning processes, supervised, unsupervised, and combined supervised-unsupervised, are compared by studying the asymptotic behavior of the estimates obtained under each process. Upper and lower bounds on the asymptotic covariance matrices are derived. It is shown that under a normal mixture density assumption for the probability density function of the feature space, the combined supervised-unsupervised learning is always superior to the supervised learning in achieving better estimates. Experimental results are provided to verify the theoretical concepts.
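The combined supervised-unsupervised estimation described in the abstract can be illustrated with a small sketch: labeled samples contribute fixed (one-hot) component assignments, while unlabeled samples contribute soft responsibilities updated by EM. This is a minimal illustration for a two-component one-dimensional normal mixture, assuming an EM-style estimator; the function `combined_em` and its initialization are hypothetical and not the paper's exact procedure.

```python
import numpy as np

def combined_em(X_lab, y_lab, X_unlab, n_iter=50):
    """Estimate a two-component 1-D normal mixture from labeled and
    unlabeled samples. Labeled samples keep hard 0/1 responsibilities;
    unlabeled samples get soft responsibilities re-estimated each E-step.
    Illustrative sketch only, not the paper's estimator."""
    # Initialize parameters from the labeled subset alone.
    mu = np.array([X_lab[y_lab == k].mean() for k in (0, 1)])
    sigma = np.array([X_lab[y_lab == k].std() + 1e-6 for k in (0, 1)])
    pi = np.array([(y_lab == k).mean() for k in (0, 1)])
    # Fixed one-hot responsibilities for labeled samples.
    R_lab = np.eye(2)[y_lab]
    for _ in range(n_iter):
        # E-step: posterior component probabilities for unlabeled samples.
        dens = np.stack(
            [pi[k] / (sigma[k] * np.sqrt(2 * np.pi))
             * np.exp(-0.5 * ((X_unlab - mu[k]) / sigma[k]) ** 2)
             for k in (0, 1)], axis=1)
        R_unlab = dens / dens.sum(axis=1, keepdims=True)
        # M-step: pool hard (labeled) and soft (unlabeled) counts.
        R = np.vstack([R_lab, R_unlab])
        X = np.concatenate([X_lab, X_unlab])
        Nk = R.sum(axis=0)
        mu = (R * X[:, None]).sum(axis=0) / Nk
        sigma = np.sqrt((R * (X[:, None] - mu) ** 2).sum(axis=0) / Nk) + 1e-9
        pi = Nk / Nk.sum()
    return mu, sigma, pi
```

With well-separated components, the pooled update lets a large unlabeled set tighten the estimates beyond what the labeled samples alone would give, which is the qualitative effect the paper's asymptotic covariance bounds quantify.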


