
DIRECTions: Design and Specification of an IR Evaluation Infrastructure



Abstract

Information Retrieval (IR) experimental evaluation is an essential part of the research on and development of information access methods and tools. Shared data sets and evaluation scenarios allow for comparing methods and systems, understanding their behaviour, and tracking performance and progress over time. On the other hand, experimental evaluation is an expensive activity in terms of the human effort, time, and costs required to carry it out. Software and hardware infrastructures that support the operation of experimental evaluation, as well as the management, enrichment, and exploitation of the produced scientific data, provide a key contribution in reducing such effort and costs and in carrying out systematic and thorough analyses and comparisons of systems and methods, overall acting as enablers of scientific and technical advancement in the field. This paper describes the specification for an IR evaluation infrastructure by conceptually modeling the entities involved in IR experimental evaluation and their relationships, and by defining the architecture of the proposed evaluation infrastructure and the APIs for accessing it.
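To make the abstract's notion of modeled evaluation entities more concrete, the following is a minimal sketch in Python. It is not the paper's actual conceptual model or API: the entity names (Collection, Topic, Run) and the precision@k helper are hypothetical assumptions, chosen only to illustrate the kind of objects and measurements an IR evaluation infrastructure typically manages.

```python
# Hypothetical sketch of IR evaluation entities; not the DIRECT model from the paper.
from dataclasses import dataclass, field
from typing import Dict, List, Set


@dataclass
class Collection:
    """A shared document collection used in an evaluation campaign."""
    collection_id: str
    documents: Set[str] = field(default_factory=set)


@dataclass
class Topic:
    """A search topic together with its relevance judgements (qrels)."""
    topic_id: str
    relevant_docs: Set[str] = field(default_factory=set)


@dataclass
class Run:
    """A system's submitted results: one ranked document list per topic."""
    run_id: str
    rankings: Dict[str, List[str]] = field(default_factory=dict)


def precision_at_k(run: Run, topic: Topic, k: int = 10) -> float:
    """Fraction of the top-k retrieved documents that are judged relevant."""
    ranking = run.rankings.get(topic.topic_id, [])[:k]
    if not ranking:
        return 0.0
    hits = sum(1 for doc in ranking if doc in topic.relevant_docs)
    return hits / len(ranking)


if __name__ == "__main__":
    topic = Topic("T1", relevant_docs={"d1", "d3"})
    run = Run("sys-A", rankings={"T1": ["d1", "d2", "d3", "d4"]})
    print(precision_at_k(run, topic, k=4))  # 0.5
```

In an infrastructure like the one the paper specifies, such entities would be persisted and exposed through APIs so that runs, judgements, and computed measures remain comparable across systems and over time.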
