Journal: Information Retrieval

Tasks, topics and relevance judging for the TREC Genomics Track: five years of experience evaluating biomedical text information retrieval systems



Abstract

With the help of a team of expert biologist judges, the TREC Genomics track has generated four large sets of "gold standard" test collections, comprising over a hundred unique topics, two kinds of ad hoc retrieval tasks, and their corresponding relevance judgments. Over the years of the track, increasingly complex tasks necessitated the creation of judging tools and training guidelines to accommodate teams of part-time, short-term workers from a variety of specialized biological scientific backgrounds, and to address the consistency and reproducibility of the assessment process. Important lessons were learned about factors that influenced the utility of the test collections, including topic design, annotations provided by judges, methods used for identifying and training judges, and the provision of a central moderator "meta-judge".
