SpringerPlus

Question answering system using Q&A site corpus: Query expansion and answer candidate evaluation



Abstract

Question Answering (QA) is the task of answering natural-language questions with adequate sentences. This paper proposes two methods to improve the performance of a QA system that uses a Q&A site corpus. The first method targets the relevant-document retrieval module: we modify the mutual-information measure used for query expansion, computing it between two words in each question and a word in its answer in the Q&A site corpus, so that unsuitable expansion words are not chosen. The second method targets the candidate-answer evaluation module: we evaluate candidate answers using two measures together, namely the Web relevance score and the translation probability. Experiments on a Japanese Q&A site corpus showed that the first proposed method was significantly better than the original method in both accuracy and MRR (Mean Reciprocal Rank), and that the second method was significantly better than the original methods in MRR.
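The abstract describes the two proposals only at a high level. The toy sketch below (the corpus, the co-occurrence counting, and the linear interpolation of the two answer scores are all illustrative assumptions, not the paper's data or exact formulas) shows the core idea of the first method: scoring a candidate expansion term by its mutual information with a *pair* of question words, so that terms tied to only one query word are filtered out; a small helper also sketches how two candidate-answer measures might be combined.

```python
import math
from itertools import combinations

# Toy stand-in for a Q&A site corpus: (question words, answer words).
# All data here is illustrative; the paper uses a Japanese Q&A site.
QA_CORPUS = [
    ({"python", "install", "windows"}, {"installer", "download", "msi"}),
    ({"python", "install", "linux"},   {"apt", "package", "download"}),
    ({"java", "install", "windows"},   {"installer", "jdk", "download"}),
    ({"python", "list", "sort"},       {"sorted", "key", "lambda"}),
]

def pair_mi(q_pair, a_word, corpus):
    """Mutual information between a PAIR of question words and an
    answer-side word, estimated from co-occurrence counts.
    Requiring the answer word to co-occur with both question words
    suppresses expansion terms that relate to only one query word."""
    n = len(corpus)
    n_pair = sum(1 for q, _ in corpus if q_pair <= q)
    n_word = sum(1 for _, a in corpus if a_word in a)
    n_joint = sum(1 for q, a in corpus if q_pair <= q and a_word in a)
    if n_joint == 0:
        return float("-inf")
    return math.log((n_joint / n) / ((n_pair / n) * (n_word / n)))

def expand_query(q_words, corpus, top_k=2):
    """Rank answer-side words as expansion terms by the best MI they
    achieve with any pair of query words."""
    pairs = [set(p) for p in combinations(sorted(q_words), 2)]
    vocab = set().union(*(a for _, a in corpus))
    scores = {}
    for w in vocab:
        best = max(pair_mi(p, w, corpus) for p in pairs)
        if best > float("-inf"):
            scores[w] = best
    return [w for w, _ in sorted(scores.items(), key=lambda kv: -kv[1])[:top_k]]

def candidate_score(web_relevance, translation_prob, alpha=0.5):
    """Combine the two candidate-answer measures; a simple linear
    interpolation is assumed here, not the paper's exact formula."""
    return alpha * web_relevance + (1 - alpha) * translation_prob
```

On the toy corpus, `expand_query({"python", "install", "windows"}, QA_CORPUS)` ranks "msi" first: it co-occurs with the full word pair {python, windows} and never appears with unrelated questions, whereas a word like "download" co-occurs with single query words across many questions and scores lower.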
