Computer Safety, Reliability, and Security

Why Are People's Decisions Sometimes Worse with Computer Support?



Abstract

In many applications of computerised decision support, a recognised source of undesired outcomes is operators' apparent over-reliance on automation. For instance, an operator may fail to react to a potentially dangerous situation because a computer fails to generate an alarm. However, the very use of terms like "over-reliance" betrays possible misunderstandings of these phenomena and their causes, which may lead to ineffective corrective action (e.g. training or procedures that do not counteract all the causes of the apparently "over-reliant" behaviour). We review relevant literature in the area of "automation bias" and describe the diverse mechanisms that may be involved in human errors when using computer support. We discuss these mechanisms, with reference to errors of omission when using "alerting systems", with the help of examples of novel counterintuitive findings we obtained from a case study in a health care application, as well as other examples from the literature.
