Journal: Machine Learning

Joint consensus and diversity for multi-view semi-supervised classification



Abstract

As data can be acquired in an ever-increasing number of ways, multi-view data is becoming increasingly available. Considering the high cost of labeling data in many machine learning applications, we focus on the multi-view semi-supervised classification problem. To address this problem, in this paper we propose a method called joint consensus and diversity for multi-view semi-supervised classification, which simultaneously learns a common label matrix for all training samples and a set of view-specific classifiers. A novel classification loss, named the probabilistic square hinge loss, is proposed; it avoids the incorrect-penalization problem and characterizes the contribution of each training sample according to its uncertainty. The power mean is introduced to combine the losses of the different views; it contains the auto-weighted strategy as a special case and distinguishes the importance of the various views. To solve the resulting non-convex minimization problem, we prove that its solution can be obtained from another problem with introduced variables, and we develop an efficient optimization algorithm with proven convergence. Extensive experimental results on nine datasets demonstrate the effectiveness of the proposed algorithm.
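Two of the building blocks named in the abstract, the squared hinge loss and the power mean over per-view losses, can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the function names, the plain (non-probabilistic) squared hinge, and the choice of NumPy are assumptions; the paper's probabilistic variant additionally weights samples by their label uncertainty.

```python
import numpy as np

def squared_hinge(scores, labels):
    # Standard squared hinge loss: max(0, 1 - y * f(x))^2 per sample.
    # The paper's "probabilistic square hinge loss" builds on this idea
    # but also accounts for sample uncertainty (details in the paper).
    return np.maximum(0.0, 1.0 - labels * scores) ** 2

def power_mean(view_losses, p):
    # Power (generalized) mean of per-view losses:
    #   M_p(l_1, ..., l_V) = ((1/V) * sum_v l_v^p)^(1/p)
    # Varying p interpolates between mean-like behaviors
    # (p = 1 arithmetic mean, p = -1 harmonic mean), which changes
    # how strongly poorly fitting views dominate the combined loss.
    v = np.asarray(view_losses, dtype=float)
    return np.mean(v ** p) ** (1.0 / p)

# Example: combine the losses of two hypothetical views.
losses = [2.0, 8.0]
print(power_mean(losses, 1.0))   # arithmetic mean: 5.0
print(power_mean(losses, -1.0))  # harmonic mean: 3.2
```

Smaller values of p downweight views with large losses, which is one way an aggregation rule can emphasize well-fitting views without hand-tuned per-view weights.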
