
Subset Ranking Using Regression

Abstract

We study the subset ranking problem, motivated by its important application in web-search. In this context, we consider the standard DCG criterion (discounted cumulated gain) that measures the quality of items near the top of the rank-list. Similar to error minimization for binary classification, the DCG criterion leads to a non-convex optimization problem that can be NP-hard. Therefore a computationally more tractable approach is needed. We present bounds that relate the approximate optimization of DCG to the approximate minimization of certain regression errors. These bounds justify the use of convex learning formulations for solving the subset ranking problem. The resulting estimation methods are not conventional, in that we focus on the estimation quality in the top-portion of the rank-list. We further investigate the generalization ability of these formulations. Under appropriate conditions, the consistency of the estimation schemes with respect to the DCG metric can be derived.
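
The sketch below is not from the paper; it is a minimal illustration of the pipeline the abstract describes: estimate each item's relevance with a regression model, order items by the estimated scores, and evaluate the resulting rank-list with DCG. The function names are made up for this example, and it assumes the common log2 position discount with raw relevance grades as gains (the paper allows a general family of decreasing discount weights).

import math

def dcg_at_k(relevances, k):
    # Discounted cumulated gain of a rank-list truncated at position k.
    # Gain is the raw relevance grade; the discount is 1 / log2(1 + position).
    return sum(rel / math.log2(i + 2)          # i is 0-based, so position = i + 1
               for i, rel in enumerate(relevances[:k]))

def dcg_of_regression_ranking(scores, relevances, k):
    # Sort items by predicted (regression) scores, then evaluate DCG@k
    # against their true relevance grades.
    order = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
    return dcg_at_k([relevances[i] for i in order], k)

# Toy example: three documents for one query, graded 0-2.
true_rels = [2, 0, 1]
predicted = [0.9, 0.1, 0.4]    # regression estimates of relevance
print(dcg_of_regression_ranking(predicted, true_rels, k=3))
# 2/log2(2) + 1/log2(3) + 0/log2(4) ≈ 2.63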
