International Conference on Autonomous Agents and Multiagent Systems

Welcome to the Real World: How Agent Strategy Increases Human Willingness to Deceive

Abstract

Humans that negotiate through representatives often instruct those representatives to act in certain ways that align with both the client's goals and his or her social norms. However, which tactics and ethical norms humans endorse vary widely from person to person, and these endorsements may be easy to manipulate. This work presents the results of a study that demonstrates that humans that interact with an artificial agent may change what kinds of tactics and norms they endorse - often dramatically. Previous work has indicated that people that negotiate through artificial agent representatives may be more inclined to fairness than those people that negotiate directly. Our work qualifies that initial picture, demonstrating that subsequent experience may change this tendency toward fairness. By exposing human negotiators to tough, automated agents, we are able to shift the participant's willingness to deceive others and utilize "hard-ball" negotiation techniques. In short, what techniques people decide to endorse is dependent upon their context and experience. We examine the effects of interacting with four different types of automated agents, each with a unique strategy, and how this subsequently changes which strategies a human negotiator might later endorse. In the study, which was conducted on an online negotiation platform, four different types of automated agents negotiate with humans over the course of a 10-minute interaction. The agents differ in a 2×2 design according to agent strategy (tough vs. fair) and agent attitude (nice vs. nasty). These results show that in this multi-issue bargaining task, humans that interacted with a tough agent were more willing to endorse deceptive techniques when instructing their own representative. These kinds of techniques were endorsed even if the agent the human encountered did not use deception as part of its strategy. In contrast to some previous work, there was not a significant effect of agent attitude. These results indicate the power of allowing people to program agents that follow their instructions, but also indicate that these social norms and tactic endorsements may be mutable in the presence of real negotiation experience.
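To make the factorial design concrete, the following minimal sketch (Python, not taken from the paper) enumerates the four automated-agent conditions obtained by crossing agent strategy (tough vs. fair) with agent attitude (nice vs. nasty); all identifiers are illustrative assumptions.

    # Illustrative sketch only: the 2x2 agent conditions described in the abstract.
    # Strategy (tough vs. fair) is crossed with attitude (nice vs. nasty).
    from itertools import product

    STRATEGIES = ("tough", "fair")
    ATTITUDES = ("nice", "nasty")

    # The four automated-agent conditions of the 2x2 design.
    AGENT_CONDITIONS = [
        {"strategy": s, "attitude": a} for s, a in product(STRATEGIES, ATTITUDES)
    ]

    if __name__ == "__main__":
        for cond in AGENT_CONDITIONS:
            print(f"{cond['strategy']} strategy, {cond['attitude']} attitude")

Running the sketch simply lists the four agent types (tough/nice, tough/nasty, fair/nice, fair/nasty) that human participants encountered during the 10-minute negotiations.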
