IEEE International Symposium on Information Theory

Active Learning for Classification with Abstention

Abstract

We consider the problem of binary classification with the caveat that the learner can abstain from declaring a label, incurring a cost λ ∈ [0, 1/2] in the process. This is referred to as the problem of binary classification with a fixed cost of abstention. For this problem, we propose an active learning strategy that constructs a non-uniform partition of the input space and focuses sampling in the regions near the decision boundaries. Our proposed algorithm can work in all the commonly used active learning query models, namely membership-query, pool-based, and stream-based. We obtain an upper bound on the excess risk of our proposed algorithm under standard smoothness and margin assumptions and demonstrate its minimax near-optimality by deriving a matching (modulo poly-logarithmic factors) lower bound. The achieved minimax rates are always faster than the corresponding rates in the passive setting, and furthermore the improvement increases with larger values of the smoothness and margin parameters.
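The abstract does not spell out the construction, but as a rough sketch under standard plug-in assumptions, an abstaining predictor (Chow-style rule for cost λ) together with a pool-based query loop that concentrates labels near the decision/abstention boundary might look as follows. The names `chow_abstain_rule`, `toy_active_loop`, and `label_oracle`, the uniform histogram partition, and the boundary score are illustrative placeholders, not the paper's algorithm, which builds a non-uniform partition.

```python
import numpy as np

def chow_abstain_rule(eta_hat, lam):
    """Plug-in analogue of Chow's rule for abstention cost lam in [0, 1/2].

    Predict the more likely label when the estimated conditional probability
    eta_hat ~ P(Y = 1 | X = x) is far from 1/2; abstain (return None) when
    min(eta_hat, 1 - eta_hat) > lam, i.e. when paying the abstention cost is
    cheaper in expectation than committing to either label.
    """
    if min(eta_hat, 1.0 - eta_hat) > lam:
        return None                        # abstain
    return 1 if eta_hat >= 0.5 else 0      # predict the more likely label


def toy_active_loop(pool_x, label_oracle, lam, budget, n_cells=20):
    """Toy pool-based loop on [0, 1]: spend the label budget on pool points
    whose current estimate sits closest to the abstention boundary
    |eta - 1/2| = 1/2 - lam.

    A uniform histogram partition stands in for the paper's non-uniform,
    adaptively refined partition; this is purely illustrative.
    """
    edges = np.linspace(0.0, 1.0, n_cells + 1)
    counts = np.zeros(n_cells)                   # labels queried per cell
    ones = np.zeros(n_cells)                     # positive labels per cell
    cell = np.clip(np.digitize(pool_x, edges) - 1, 0, n_cells - 1)
    queried = np.zeros(len(pool_x), dtype=bool)

    for _ in range(min(budget, len(pool_x))):
        eta_hat = (ones + 1.0) / (counts + 2.0)  # smoothed per-cell estimate
        # distance of each pool point's cell estimate from the abstention boundary
        score = np.abs(np.abs(eta_hat[cell] - 0.5) - (0.5 - lam))
        score[queried] = np.inf                  # never re-query a point
        i = int(np.argmin(score))                # most boundary-like point
        y = label_oracle(pool_x[i])
        queried[i] = True
        counts[cell[i]] += 1
        ones[cell[i]] += y

    eta_hat = (ones + 1.0) / (counts + 2.0)

    def classifier(x):
        c = int(np.clip(np.digitize(x, edges) - 1, 0, n_cells - 1))
        return chow_abstain_rule(eta_hat[c], lam)
    return classifier
```

For example, with lam = 0.1 the rule abstains exactly when the estimated probability lies in (0.1, 0.9), since there min(eta_hat, 1 - eta_hat) exceeds the abstention cost.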
