ACM Transactions on Information Systems

Search Result Reranking with Visual and Structure Information Sources



Abstract

Relevance estimation is among the most important tasks in the ranking of search results. Current methodologies mainly concentrate on text matching, link analysis, and user behavior models. However, users judge the relevance of search results directly from Search Engine Result Pages (SERPs), which provide valuable signals for reranking. In this article, we propose two different approaches to aggregate the visual, structural, and textual information sources of search results in relevance estimation. The first is a late-fusion framework named the Joint Relevance Estimation model (JRE). JRE estimates relevance independently from the screenshots, textual contents, and HTML source code of search results and jointly makes the final decision through an inter-modality attention mechanism. The second is an early-fusion framework named the Tree-based Deep Neural Network (TreeNN), which embeds the texts and images into the HTML parse tree through a recursive process. To evaluate the performance of the proposed models, we construct a large-scale practical Search Result Relevance (SRR) dataset that consists of multiple information sources and relevance labels for over 60,000 search results. Experimental results show that the two proposed models outperform state-of-the-art ranking solutions as well as the original rankings of commercial search engines.
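The late-fusion idea behind JRE — score each modality separately, then let an attention mechanism decide how much each modality's score contributes — can be illustrated with a minimal numerical sketch. This is not the paper's actual implementation: the feature dimensions, weight vectors, and function names (`jre_fuse`, `score_weights`, `attn_weights`) are all hypothetical stand-ins for learned model components.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D array
    e = np.exp(x - np.max(x))
    return e / e.sum()

def jre_fuse(modality_feats, score_weights, attn_weights):
    """Late fusion over modalities (e.g., screenshot, text, HTML).

    Each modality yields its own relevance score; attention weights,
    also derived from the modality features, gate the final decision.
    """
    # one relevance score per modality
    scores = np.array([f @ w for f, w in zip(modality_feats, score_weights)])
    # attention logits decide how much each modality contributes
    logits = np.array([f @ w for f, w in zip(modality_feats, attn_weights)])
    attn = softmax(logits)
    # final relevance: attention-weighted combination of per-modality scores
    return float(attn @ scores)

# toy example with random 8-dim features for three modalities
rng = np.random.default_rng(0)
feats = [rng.standard_normal(8) for _ in range(3)]
score_w = [rng.standard_normal(8) for _ in range(3)]
attn_w = [rng.standard_normal(8) for _ in range(3)]
rel = jre_fuse(feats, score_w, attn_w)
```

In the real model the per-modality scores would come from separate neural encoders (a CNN over the screenshot, text encoders over the snippet and HTML), but the fusion step reduces to this attention-weighted average of modality scores.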

Bibliographic Details

Similar Documents

  • Foreign literature
  • Chinese literature
  • Patents
