
Why Not Artificial Sympathy?

Abstract

"Empathy" and "Sympathy" are often confusingly used. Beside the difference in their usage, the key component could be a sort of emotional state to be shared, and the way to represent or manipulate it might be different. This could be clearer when we attempt to design it for artificial agents. This paper argues what are differences between empathy and sympathy, and how to design each of them for an artificial agent. First, the dictionary meaning of both is reviewed, and a metaphor to intuitively explain the difference is introduced. Next, we argue how artificial empathy and artificial sympathy can be designed, and a cognitive developmental robotics is introduced as a promising approach to the latter, especially from a viewpoint of learning and development. A rough design for artificial sympathy is argued, and preliminary studies needed to build the artificial sympathy are introduced. Finally, future issues are given.
