
Supervised Learning and Co-training



Abstract

Co-training under the Conditional Independence Assumption is among the models which demonstrate how radically the need for labeled data can be reduced if a huge amount of unlabeled data is available. In this paper, we explore how much credit for this saving must be assigned solely to the extra-assumptions underlying the Co-training model. To this end, we compute general (almost tight) upper and lower bounds on the sample size needed to achieve the success criterion of PAC-learning within the model of Co-training under the Conditional Independence Assumption in a purely supervised setting. The upper bounds lie significantly below the lower bounds for PAC-learning without Co-training. Thus, Co-training saves labeled data even when not combined with unlabeled data. On the other hand, the saving is much less radical than the known savings in the semi-supervised setting.
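For orientation, a minimal reference point (these are the well-known bounds for the standard realizable PAC model, not results of this paper): to PAC-learn a concept class of VC dimension d to accuracy \varepsilon with confidence 1-\delta, the number m of labeled examples satisfies, up to constants,

\[
m_{\mathrm{upper}} = O\!\left(\frac{1}{\varepsilon}\left(d\ln\frac{1}{\varepsilon} + \ln\frac{1}{\delta}\right)\right),
\qquad
m_{\mathrm{lower}} = \Omega\!\left(\frac{d + \ln\frac{1}{\delta}}{\varepsilon}\right).
\]

The paper's comparison is against lower bounds of this kind: its upper bounds for purely supervised learning within the Co-training model fall below them, while remaining far above the label savings known for the semi-supervised setting.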

Bibliographic information

  • Source
    Algorithmic learning theory | 2011 | pp. 425-439 | 15 pages
  • Venue: Espoo (FI)
  • Author affiliations

    Fakultät für Mathematik, Ruhr-Universität Bochum, D-44780 Bochum, Germany;

    Fakultät für Mathematik, Ruhr-Universität Bochum, D-44780 Bochum, Germany;

    Hungarian Academy of Sciences and University of Szeged Research Group on Artificial Intelligence H-6720 Szeged, Hungary;

  • Conference organizer
  • Format: PDF
  • Language: English (eng)
  • CLC classification: Artificial intelligence theory;
  • Keywords

