Journal: Information Retrieval

Gradient descent optimization of smoothed information retrieval metrics



Abstract

Most ranking algorithms are based on the optimization of a loss function, such as the pairwise loss. However, these loss functions often differ from the criteria that are adopted to measure the quality of web page ranking results. To overcome this problem, we propose an algorithm which aims at directly optimizing popular measures such as the Normalized Discounted Cumulative Gain (NDCG) and the Average Precision (AP). The basic idea is to minimize a smooth approximation of these measures with gradient descent. Crucial to this kind of approach is the choice of the smoothing factor. We provide various theoretical analyses of that choice and propose an annealing algorithm to iteratively minimize a less and less smoothed approximation of the measure of interest. Results on the Letor benchmark datasets show that the proposed algorithm achieves state-of-the-art performance.
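Since the abstract only outlines the approach, the following is a minimal NumPy sketch of the general idea, under explicit assumptions: NDCG is smoothed by replacing hard ranks with a sigmoid-based soft rank governed by a smoothing factor sigma, a linear scorer X @ w is trained by gradient steps on that smoothed objective, and sigma is annealed toward zero. The smoothing form, the linear model, the finite-difference gradient, and the helper names soft_ndcg, numeric_grad, and train_weights are illustrative choices, not the paper's exact formulation.

```python
import numpy as np

def soft_ndcg(scores, relevance, sigma):
    """Differentiable surrogate for NDCG: hard ranks are replaced by a
    sigmoid-based soft rank controlled by the smoothing factor sigma."""
    # Soft rank of doc i: 1 + sum_{j != i} sigmoid((s_j - s_i) / sigma).
    # The self term equals sigmoid(0) = 0.5, hence the 0.5 offset below.
    diff = (scores[None, :] - scores[:, None]) / sigma
    soft_rank = 0.5 + np.sum(1.0 / (1.0 + np.exp(-diff)), axis=1)
    dcg = np.sum((2.0 ** relevance - 1.0) / np.log2(1.0 + soft_rank))
    ideal = np.sort(relevance)[::-1]
    idcg = np.sum((2.0 ** ideal - 1.0) / np.log2(2.0 + np.arange(len(ideal))))
    return dcg / idcg

def numeric_grad(f, w, eps=1e-5):
    """Central finite differences; a placeholder for the analytic gradient."""
    g = np.zeros_like(w)
    for k in range(w.size):
        e = np.zeros_like(w)
        e[k] = eps
        g[k] = (f(w + e) - f(w - e)) / (2.0 * eps)
    return g

def train_weights(X, relevance, sigmas=(1.0, 0.3, 0.1), steps=100, lr=0.1):
    """Annealing loop: take gradient steps on the smoothed NDCG of a linear
    scorer X @ w, shrinking sigma so the surrogate tightens toward true NDCG."""
    w = np.zeros(X.shape[1])
    for sigma in sigmas:                 # less and less smoothing
        objective = lambda v, s=sigma: soft_ndcg(X @ v, relevance, s)
        for _ in range(steps):
            w += lr * numeric_grad(objective, w)   # ascend the smoothed measure
    return w

# Synthetic usage: 20 documents with 5 features and graded relevance in {0, 1, 2}.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 5))
rel = rng.integers(0, 3, size=20).astype(float)
w = train_weights(X, rel)
print("smoothed NDCG:", soft_ndcg(X @ w, rel, sigma=0.1))
```

The annealing schedule (`sigmas`) mirrors the abstract's idea of minimizing a less and less smoothed approximation; in a faithful implementation, the paper's own smoothing of NDCG/AP and its analytic gradient would replace the sigmoid soft rank and the finite-difference placeholder used here.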
