Positive-Unlabeled Learning with Non-Negative Risk Estimator

Abstract

From only positive (P) and unlabeled (U) data, a binary classifier can be trained with PU learning, in which the state of the art is unbiased PU learning. However, if its model is very flexible, its empirical risk on training data will go negative and we will suffer from serious overfitting. In this paper, we propose a non-negative risk estimator for PU learning. When being minimized, it is more robust against overfitting and thus we are able to train very flexible models given limited P data. Moreover, we analyze the bias, consistency, and mean-squared-error reduction of the proposed risk estimator and the estimation error of the corresponding risk minimizer. Experiments show that the proposed risk estimator successfully fixes the overfitting problem of its unbiased counterparts.
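
The abstract describes the estimator only at a high level. As a rough illustration, the sketch below computes a non-negative PU risk estimate of the kind described, using NumPy. The class prior `pi_p`, the choice of sigmoid loss, and all function and argument names are assumptions made for this sketch, not taken from the page itself.

```python
# Minimal sketch of a non-negative PU risk estimate (assumed notation).
import numpy as np

def sigmoid_loss(z):
    # Sigmoid loss l(z) = 1 / (1 + exp(z)), a smooth surrogate of the 0-1 loss.
    return 1.0 / (1.0 + np.exp(z))

def nn_pu_risk(g_p, g_u, pi_p, loss=sigmoid_loss):
    """Non-negative PU risk estimate (illustrative sketch).

    g_p  : classifier outputs g(x) on positive (P) samples
    g_u  : classifier outputs g(x) on unlabeled (U) samples
    pi_p : class prior P(y = +1), assumed known or estimated separately
    """
    risk_p_pos = loss(g_p).mean()   # P data scored as positive
    risk_p_neg = loss(-g_p).mean()  # P data scored as negative
    risk_u_neg = loss(-g_u).mean()  # U data scored as negative

    # The unbiased PU risk uses (risk_u_neg - pi_p * risk_p_neg) directly,
    # which can go negative for flexible models; clipping it at zero gives
    # the non-negative estimator, which is more robust against overfitting.
    neg_part = risk_u_neg - pi_p * risk_p_neg
    return pi_p * risk_p_pos + max(0.0, neg_part)
```

For example, `nn_pu_risk(model(x_p), model(x_u), pi_p=0.4)` would give a training objective that stays non-negative even when the model starts to overfit the limited P data.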