
Learning to Rank With Bregman Divergences and Monotone Retargeting



Abstract

This paper introduces a novel approach to learning to rank (LETOR) based on the notion of monotone retargeting. It involves minimizing a divergence between all monotonically increasing transformations of the training scores and a parameterized prediction function; the minimization is over both the transformations and the parameters. The approach is applied to Bregman divergences, a large class of "distance-like" functions that were recently shown to be the unique class that is statistically consistent with the normalized discounted cumulative gain (NDCG) criterion [19]. The algorithm uses alternating-projection-style updates, in which one set of simultaneous projections can be computed independently of the Bregman divergence and the other reduces to parameter estimation of a generalized linear model. This yields an easily implemented, efficiently parallelizable algorithm for the LETOR task that enjoys global-optimum guarantees under mild conditions. We present empirical results on benchmark datasets showing that this approach can outperform state-of-the-art NDCG-consistent techniques.
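A sketch of the objective in symbols (the notation here is ours, not taken from the paper): let y be the vector of training scores, let \mathcal{M}(y) be the set of score vectors that preserve the ordering of y (the admissible retargetings), let D_\phi be the Bregman divergence generated by a strictly convex function \phi, and let f_w be the prediction function with parameters w. Monotone retargeting then solves the joint problem

    \min_{z \in \mathcal{M}(y)} \; \min_{w} \; D_\phi\big(z \,\big\|\, f_w(X)\big),
    \qquad
    D_\phi(z \,\|\, \hat{z}) = \phi(z) - \phi(\hat{z}) - \langle \nabla\phi(\hat{z}),\, z - \hat{z} \rangle,

and the alternating updates described in the abstract fix one argument while minimizing over the other.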
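Below is a minimal, runnable Python sketch of this alternating scheme for the special case \phi(t) = t^2/2 (squared error), where the Bregman projection onto order-preserving vectors is exactly isotonic regression (pool-adjacent-violators, PAVA) and the parameter step is ordinary least squares. The function names (pava, monotone_retarget_fit), the fixed iteration count, and the crude mean/variance renormalization used to rule out the degenerate constant solution are our simplifications for illustration, not the paper's algorithm.

import numpy as np

def pava(s):
    """Pool-adjacent-violators: the non-decreasing sequence closest
    to s in squared error (the squared-error Bregman projection)."""
    vals, wts = [], []
    for v in s.astype(float):
        vals.append(v)
        wts.append(1.0)
        # merge adjacent blocks while the monotone constraint is violated
        while len(vals) > 1 and vals[-2] > vals[-1]:
            w = wts[-2] + wts[-1]
            vals[-2] = (vals[-2] * wts[-2] + vals[-1] * wts[-1]) / w
            wts[-2] = w
            vals.pop()
            wts.pop()
    out = np.empty(len(s))
    i = 0
    for v, w in zip(vals, wts):
        out[i:i + int(w)] = v
        i += int(w)
    return out

def monotone_retarget_fit(X, y, n_iter=50):
    """Alternating-minimization sketch: refit a linear model (w-step)
    and retarget the scores (z-step) under squared error."""
    order = np.argsort(y)            # monotone constraint follows y's ranking
    Xs = X[order]
    z = y[order].astype(float)       # initial retargeted scores
    z = (z - z.mean()) / (z.std() + 1e-12)
    for _ in range(n_iter):
        # w-step: GLM parameter estimation; with squared error this
        # reduces to ordinary least squares
        w, *_ = np.linalg.lstsq(Xs, z, rcond=None)
        # z-step: project predictions onto sequences that respect the
        # training order (isotonic regression for squared error)
        z = pava(Xs @ w)
        # crude renormalization to exclude the constant solution; the
        # paper enforces this constraint more carefully
        z = (z - z.mean()) / (z.std() + 1e-12)
    return w

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))
    y = X @ rng.normal(size=5) + 0.1 * rng.normal(size=200)
    w = monotone_retarget_fit(X, y)
    # rank new items by sorting on X_new @ w in descending order

Because the z-step only depends on the ordering of y, each query's projection can be computed independently, which is the source of the parallelism mentioned in the abstract.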
