IEEE International Conference on Data Mining

Stability-Based Stopping Criterion for Active Learning



Abstract

While active learning has drawn broad attention in recent years, there are relatively few studies on stopping criteria for active learning. We propose a novel model-stability-based stopping criterion, which considers the potential of each unlabeled example to change the model once added to the training set. The underlying motivation is that active learning should terminate when the model would not change much by adding the remaining examples. Inspired by the widely used stochastic gradient update rule, we use the gradient of the loss at each candidate example to measure its capability to change the classifier. Under this model-change rule, we stop active learning when the changing ability of all remaining unlabeled examples is less than a given threshold. We apply the stability-based stopping criterion to two popular classifiers, logistic regression and support vector machines (SVMs), and it can be generalized to a wide spectrum of learning models. Substantial experimental results on various UCI benchmark data sets demonstrate that the proposed approach outperforms state-of-the-art methods in most cases.
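The gradient-based stopping rule described in the abstract can be illustrated with a short sketch. The Python snippet below is a minimal illustration for the logistic-regression case only; the function names (max_model_change, should_stop), the threshold value, and the expected-gradient-length weighting over the two possible labels are assumptions made for illustration, not details taken from the paper.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def max_model_change(clf, X_unlabeled):
    """Estimate, for each unlabeled example, how much it could change the
    model if it were labeled and added to the training set, measured by
    the norm of the gradient of the logistic loss at the current weights.

    Since the true labels are unknown, the gradient length is averaged
    over the two possible labels, weighted by the model's predicted
    probabilities (an expected-gradient-length surrogate; this specific
    weighting is an assumption).
    """
    p = clf.predict_proba(X_unlabeled)[:, 1]      # P(y = 1 | x)
    # Gradient of the logistic loss w.r.t. the weights for label y is
    # (p - y) * x, so its norm is |p - y| * ||x||.
    x_norm = np.linalg.norm(X_unlabeled, axis=1)
    g_if_pos = (1.0 - p) * x_norm                 # if the label were y = 1
    g_if_neg = p * x_norm                         # if the label were y = 0
    expected_change = p * g_if_pos + (1.0 - p) * g_if_neg
    return expected_change.max()

def should_stop(clf, X_unlabeled, threshold=1e-3):
    """Stop active learning once no remaining unlabeled example can
    change the model by more than `threshold`."""
    return max_model_change(clf, X_unlabeled) < threshold
```

In an active-learning loop, should_stop would be checked after each model update, terminating label queries once every remaining candidate's estimated model change falls below the threshold.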
