Conference: Uncertainty in Artificial Intelligence

Online Importance Weight Aware Updates



Abstract

An importance weight quantifies the relative importance of one example over another, coming up in applications of boosting, asymmetric classification costs, reductions, and active learning. The standard approach for dealing with importance weights in gradient descent is via multiplication of the gradient. We first demonstrate the problems of this approach when importance weights are large, and argue in favor of more sophisticated ways for dealing with them. We then develop an approach which enjoys an invariance property: that updating twice with importance weight h is equivalent to updating once with importance weight 2h. For many important losses this has a closed form update which satisfies standard regret guarantees when all examples have h = 1. We also briefly discuss two other reasonable approaches for handling large importance weights. Empirically, these approaches yield substantially superior prediction with similar computational performance while reducing the sensitivity of the algorithm to the exact setting of the learning rate. We apply these to online active learning, yielding an extraordinarily fast active learning algorithm that works even in the presence of adversarial noise.
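A minimal sketch of the two update rules contrasted in the abstract, for the concrete case of squared loss ℓ(p, y) = (p − y)²/2 on a linear model p = w·x. The closed-form importance-aware update below follows the paper's recipe of integrating the gradient flow; the function names, learning rate, and toy data are our own illustrative choices, not the authors' code.

```python
import math

def aware_update(w, x, y, h, eta):
    """Importance-aware closed-form update for squared loss (sketch).

    Solving the continuous-time gradient flow for h units of 'importance'
    gives a residual that decays by exp(-eta * h * x.x), so the prediction
    approaches the label y but never overshoots it.
    """
    xx = sum(xi * xi for xi in x)
    p = sum(wi * xi for wi, xi in zip(w, x))
    scale = (y - p) * (1.0 - math.exp(-eta * h * xx)) / xx
    return [wi + scale * xi for wi, xi in zip(w, x)]

def naive_update(w, x, y, h, eta):
    """Standard approach: multiply the gradient by the importance weight h."""
    p = sum(wi * xi for wi, xi in zip(w, x))
    return [wi - eta * h * (p - y) * xi for wi, xi in zip(w, x)]

w0 = [0.0, 0.0]
x, y = [1.0, 2.0], 3.0

# Invariance property: two updates with weight h equal one update with 2h.
twice = aware_update(aware_update(w0, x, y, 1.0, 0.1), x, y, 1.0, 0.1)
once = aware_update(w0, x, y, 2.0, 0.1)
print(all(abs(a - b) < 1e-9 for a, b in zip(twice, once)))  # True

# The naive update overshoots badly when eta * h * x.x is large: here
# the prediction jumps from 0 past the label y = 3 all the way to 15.
big = naive_update(w0, x, y, 10.0, 0.1)
print(sum(wi * xi for wi, xi in zip(big, x)))  # 15.0
```

The overshoot in the last line is exactly the failure mode the abstract describes for large importance weights: the aware update's prediction stays on the near side of the label no matter how large h is.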
