Allocation of Moral Decision-Making in Human-Agent Teams: A Pattern Approach

Abstract

As the field of AI progresses, artificially intelligent agents will deal with increasingly morally sensitive situations. Research efforts are underway to regulate, design, and build Artificial Moral Agents (AMAs) capable of making moral decisions. This research is highly multidisciplinary, with each discipline having its own jargon and vision, and so far it is unclear whether a fully autonomous AMA can be achieved. To specify currently available solutions and structure an accessible discussion around them, we propose applying Team Design Patterns (TDPs). The TDP language describes (visually, textually, and formally) a dynamic allocation of tasks for moral decision-making in a human-agent team context. A task decomposition of moral decision-making and AMA capabilities is proposed to help define such TDPs. Four TDPs are given as examples to illustrate the versatility of the approach, and two problem scenarios (surgical robots and drone surveillance) are used to instantiate these patterns. Finally, we discuss in detail the advantages and disadvantages of a TDP approach to moral decision-making.
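The core idea of a TDP here is an allocation of moral-decision subtasks between human and agent that can shift at runtime. A minimal sketch of that idea follows; the subtask names, class, and pattern shown are hypothetical illustrations, not the paper's actual task decomposition or TDP notation:

```python
from dataclasses import dataclass

# Hypothetical decomposition of moral decision-making into subtasks.
SUBTASKS = [
    "recognize_moral_situation",
    "judge_options",
    "select_action",
    "execute_action",
]

@dataclass
class TeamDesignPattern:
    """A TDP modeled as an allocation of subtasks to team members."""
    name: str
    allocation: dict  # subtask -> "human" or "agent"

    def reallocate(self, subtask: str, member: str) -> None:
        """Dynamically hand a subtask over to another team member."""
        if subtask not in self.allocation:
            raise KeyError(f"unknown subtask: {subtask}")
        self.allocation[subtask] = member

# Example pattern: the agent handles perception and execution,
# while the human retains moral judgement and action selection.
supervised = TeamDesignPattern(
    name="human-in-the-loop",
    allocation={
        "recognize_moral_situation": "agent",
        "judge_options": "human",
        "select_action": "human",
        "execute_action": "agent",
    },
)

# Dynamic allocation: under time pressure, the human may delegate
# action selection to the agent while keeping moral judgement.
supervised.reallocate("select_action", "agent")
print(supervised.allocation["select_action"])  # agent
```

The point of the sketch is only that a pattern makes the division of moral labor explicit and revisable, rather than baking a fixed level of agent autonomy into the system.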