Semi-supervised learning via regularized boosting working on multiple semi-supervised assumptions


Abstract

Semi-supervised learning concerns the problem of learning in the presence of labeled and unlabeled data. Several boosting algorithms have been extended to semi-supervised learning with various strategies. To our knowledge, however, none of them takes all three semi-supervised assumptions, i.e., the smoothness, cluster, and manifold assumptions, into account together during boosting learning. In this paper, we propose a novel cost functional consisting of the margin cost on labeled data and a regularization penalty on unlabeled data based on the three fundamental semi-supervised assumptions. Minimizing our proposed cost functional with a greedy yet stagewise functional optimization procedure thus leads to a generic boosting framework for semi-supervised learning. Extensive experiments demonstrate that our algorithm yields favorable results on benchmark and real-world classification tasks in comparison to state-of-the-art semi-supervised learning algorithms, including newly developed boosting algorithms. Finally, we discuss relevant issues and relate our algorithm to previous work. © 2006 IEEE.
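As a rough illustration of the idea described in the abstract, the sketch below implements a stagewise boosting loop whose per-round target is the negative functional gradient of a combined cost: an exponential margin cost on labeled points plus a graph smoothness penalty on unlabeled points. This is an assumed toy reconstruction, not the paper's actual algorithm; the RBF affinity, the regression-stump weak learner, and all function names are illustrative choices.

```python
import numpy as np

def rbf_affinity(X, sigma=1.0):
    # Pairwise RBF similarities used as graph weights for the smoothness
    # penalty (an assumed choice; the paper's penalty encodes all three
    # semi-supervised assumptions).
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    return W

def fit_stump(X, target):
    # Least-squares regression stump fitted to the real-valued
    # negative-gradient target: pick the best feature/threshold pair.
    best, best_err = None, np.inf
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            left = X[:, j] <= t
            cl = target[left].mean() if left.any() else 0.0
            cr = target[~left].mean() if (~left).any() else 0.0
            err = ((target - np.where(left, cl, cr)) ** 2).sum()
            if err < best_err:
                best_err, best = err, (j, t, cl, cr)
    j, t, cl, cr = best
    return lambda Z: np.where(Z[:, j] <= t, cl, cr)

def ssl_boost(X_l, y_l, X_u, rounds=20, lam=0.1, lr=0.5, sigma=1.0):
    # Greedy stagewise minimization of:
    #   sum_i exp(-y_i F(x_i))  +  lam * sum_{i,j} w_ij (F(x_i) - F(x_j))^2
    # where the first term is the margin cost on labeled data and the
    # second a smoothness regularizer on unlabeled data.
    X, n_l = np.vstack([X_l, X_u]), len(X_l)
    W = rbf_affinity(X_u, sigma)
    F, learners = np.zeros(len(X)), []
    for _ in range(rounds):
        # Negative gradient of the exponential margin cost (labeled part).
        g_l = y_l * np.exp(-y_l * F[:n_l])
        # Negative gradient of the graph smoothness penalty (unlabeled part).
        F_u = F[n_l:]
        g_u = -2 * lam * (W.sum(1) * F_u - W @ F_u)
        h = fit_stump(X, np.concatenate([g_l, g_u]))
        learners.append(h)
        F = F + lr * h(X)
    return lambda Z: np.sign(sum(lr * h(Z) for h in learners))
```

Each round fits the weak learner to the steepest-descent direction of the combined cost, so labeled margins grow while unlabeled predictions are smoothed along the similarity graph, mirroring the "greedy yet stagewise functional optimization" the abstract describes.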

Bibliographic details

  • Authors

    Chen Ke; Wang Shihai;

  • Affiliation
  • Year: 2011
  • Pages
  • Format: PDF
  • Language: eng
  • CLC classification
