IEEE Transactions on Knowledge and Data Engineering

Modeling the Parameter Interactions in Ranking SVM with Low-Rank Approximation



Abstract

Ranking SVM, which formalizes the problem of learning a ranking model as learning a binary SVM on preference pairs of documents, is a state-of-the-art ranking model in information retrieval. The dual-form solution of a linear Ranking SVM model can be written as a linear combination of the preference pairs, i.e., w = Σ_{(i,j)} α_{ij} (x_i − x_j), where α_{ij} denotes the Lagrange parameter associated with preference pair (i, j). There exist obvious interactions among the document pairs, because two preference pairs can share the same document, e.g., the pairs (d_1, d_2) and (d_1, d_3) share document d_1. It is therefore natural to ask whether interactions also exist over the model parameters α_{ij}, which might be leveraged to construct better ranking models. This paper aims to answer that question. We empirically found that a low-rank structure exists over the rearranged Ranking SVM model parameters α_{ij}, which indicates that the interactions do exist. Based on this discovery, we modified the original Ranking SVM model by explicitly applying low-rank constraints to the Lagrange parameters, yielding two novel algorithms called Factorized Ranking SVM and Regularized Ranking SVM. Specifically, in Factorized Ranking SVM each parameter α_{ij} is decomposed as the inner product of two low-dimensional vectors, i.e., α_{ij} = ⟨v_i, v_j⟩, where v_i and v_j correspond to documents i and j, respectively; in Regularized Ranking SVM, a nuclear norm is applied to the rearranged parameter matrix to control its rank. Experimental results on three LETOR datasets show that both proposed methods outperform state-of-the-art learning-to-rank models, including the conventional Ranking SVM.
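The factorized parameterization described in the abstract can be illustrated with a short sketch. This is a minimal toy example of the construction only, not the authors' training algorithm; the variable names (`X`, `V`) and the dimensions are invented here for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: 4 documents with 5 features each; assume the documents are
# ordered by relevance, so every (i, j) with i < j is a preference pair.
n_docs, n_feat, k = 4, 5, 2
X = rng.standard_normal((n_docs, n_feat))  # document feature vectors x_i
V = rng.standard_normal((n_docs, k))       # low-dimensional factors v_i, k << n_docs

# Factorized parameters: alpha_ij = <v_i, v_j>
alpha = V @ V.T

# Dual-form weight vector: w = sum over preference pairs of alpha_ij * (x_i - x_j)
w = np.zeros(n_feat)
for i in range(n_docs):
    for j in range(i + 1, n_docs):
        w += alpha[i, j] * (X[i] - X[j])

# By construction the rearranged parameter matrix has rank at most k,
# which is the low-rank structure the paper imposes.
print(np.linalg.matrix_rank(alpha))  # at most k
```

Documents would then be ranked by the score w·x_d, exactly as in the conventional linear Ranking SVM; the factorization only constrains how the Lagrange parameters may vary jointly.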

Bibliographic Information

  • Source
  • Author Affiliations

    Renmin Univ China, Sch Informat, Beijing Key Lab Big Data Management & Anal Method, Beijing, Peoples R China;

    Chinese Acad Sci, Inst Comp Technol, CAS Key Lab Network Data Sci & Technol, Beijing, Peoples R China;


  • Indexing Information
  • Format: PDF
  • Language: eng
  • CLC Classification
  • Keywords

    Learning to rank; ranking SVM; parameter interactions; low-rank approximation;

