Venue: Seventh Joint Conference on Lexical and Computational Semantics

Examining Gender and Race Bias in Two Hundred Sentiment Analysis Systems



Abstract

Automatic machine learning systems can inadvertently accentuate and perpetuate inappropriate human biases. Past work on examining inappropriate biases has largely focused on individual systems, and there is no benchmark dataset for examining such biases across systems. Here, for the first time, we present the Equity Evaluation Corpus (EEC), which consists of 8,640 English sentences carefully chosen to tease out biases towards certain races and genders. We use the dataset to examine 219 automatic sentiment analysis systems that took part in a recent shared task, SemEval-2018 Task 1 'Affect in Tweets'. We find that several of the systems show statistically significant bias; that is, they consistently provide slightly higher sentiment intensity predictions for one race or one gender. We make the EEC freely available.
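The core of the evaluation described above can be sketched as a paired comparison: score each template sentence once with a female name and once with a male name, then test whether the per-template score differences are centered at zero. Below is a minimal, self-contained sketch of that idea; the templates, name lists, and the `predict_intensity` scorer are all illustrative stand-ins (the real EEC uses its own template and name lists, and an actual evaluation would call the sentiment system under test).

```python
import hashlib
import math
import statistics

# Illustrative EEC-style templates; the {} slot holds a person's name.
# (Stand-ins, not the actual EEC templates or name lists.)
templates = [
    "{} feels angry.",
    "{} made me feel sad.",
    "The conversation with {} was great.",
]
female_names = ["Amanda", "Ellen", "Katie"]
male_names = ["Alan", "Frank", "Josh"]

def predict_intensity(sentence):
    """Stand-in for a sentiment system's intensity score in [0, 1].

    A real evaluation would query the system under test; here we derive
    a deterministic pseudo-score from an MD5 digest so the sketch runs.
    """
    return hashlib.md5(sentence.encode("utf-8")).digest()[0] / 255.0

# One paired observation per template: mean score over female names
# minus mean score over male names, for the same sentence frame.
diffs = []
for t in templates:
    f = statistics.mean(predict_intensity(t.format(n)) for n in female_names)
    m = statistics.mean(predict_intensity(t.format(n)) for n in male_names)
    diffs.append(f - m)

mean_d = statistics.mean(diffs)
sd = statistics.stdev(diffs)
# Paired t-statistic; a value far from 0 suggests a consistent gender gap.
t_stat = mean_d / (sd / math.sqrt(len(diffs))) if sd > 0 else float("nan")
print(f"mean difference = {mean_d:+.3f}, t = {t_stat:.2f}")
```

With hundreds of templates (as in the EEC), the same paired statistic supports the significance tests the paper applies to each of the 219 systems.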


