International Journal of Software Engineering and Knowledge Engineering

Binary Independence Language Model in a Relevance Feedback Environment


Abstract

Model construction is a kind of knowledge engineering, and building retrieval models is critical to the success of search engines. This article proposes a new retrieval language model, called the binary independence language model (BILM). It integrates two document-context-based language models into one via the log-odds ratio, where both models describe the document contexts of query terms. One model is based on relevance information while the other is based on non-relevance information. Each model incorporates link dependencies and multiple query term dependencies, and term probabilities are interpolated between relative frequencies and background probabilities. In a simulated relevance feedback environment using the top 20 judged documents, our BILM performed statistically significantly better than other highly effective retrieval models at the 95% confidence level across four TREC collections, using fixed parameter values, in terms of mean average precision. For the less stable performance measure (precision at the top 10 documents), no statistical significance is shown between the different models on the individual test collections, although numerically our BILM is better than two other models at the 95% confidence level based on a paired sign test across the test collections of both the relevance feedback and retrospective experiments.
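The abstract's core scoring idea can be sketched in code: two context language models, one estimated from relevant documents and one from non-relevant documents, each smoothed by interpolating the relative frequency with a background probability, combined per query term through a log-odds ratio. The sketch below is a minimal illustration under these assumptions; the function names, the Jelinek-Mercer-style interpolation weight, and the flat treatment of contexts are hypothetical simplifications and omit the paper's link and query-term dependencies.

```python
# Hypothetical sketch of a BILM-style log-odds score; not the authors' exact model.
import math
from collections import Counter

def interpolate(term, context_counts, background, lam=0.7):
    """Smooth a term probability by interpolating its relative frequency
    in the pooled contexts with a background probability (lam is assumed)."""
    total = sum(context_counts.values())
    rel_freq = context_counts[term] / total if total else 0.0
    return lam * rel_freq + (1 - lam) * background.get(term, 1e-9)

def bilm_score(query_terms, doc_context, rel_contexts, nonrel_contexts, background):
    """Score a document by the log-odds ratio of a relevance-based context
    model against a non-relevance-based one, summed over query terms."""
    rel_counts, nonrel_counts = Counter(), Counter()
    for c in rel_contexts:        # contexts of query terms in judged-relevant docs
        rel_counts.update(c)
    for c in nonrel_contexts:     # contexts in judged non-relevant docs
        nonrel_counts.update(c)
    score = 0.0
    for t in query_terms:
        if t not in doc_context:  # only terms appearing in this document's context
            continue
        p_rel = interpolate(t, rel_counts, background)
        p_non = interpolate(t, nonrel_counts, background)
        score += math.log(p_rel / p_non)
    return score
```

A term frequent in relevant contexts but rare in non-relevant ones contributes a large positive log-odds term, so documents whose contexts resemble the relevant side rank higher.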


