
Algorithmic Fairness


Abstract

The growing use of algorithms in social and economic life has raised a concern: that they may inadvertently discriminate against certain groups. For example, one recent study found that natural language processing algorithms can embody basic gender biases, such as associating the word nurse more closely with the word she than with the word he (Caliskan, Bryson, and Narayanan 2017). Because the data used to train these algorithms are themselves tinged with stereotypes and past discrimination, it is natural to worry that biases are being "baked in." We consider this problem in the context of a specific but important case, one that is particularly amenable to economic analysis: using algorithmic predictions to guide decisions (Kleinberg et al. 2015). For example, predictions about a defendant's safety risk or flight risk are increasingly being proposed as a means to guide judges' decisions about whether to grant bail. Discriminatory predictions in these cases could have large consequences. One can easily imagine how this could happen, since recidivism predictions are built on past arrests, which may themselves be racially biased. In fact, a recent ProPublica investigation argued that the risk tool used in one Florida county was in fact discriminatory (Angwin et al. 2016). This widely read article helped further elevate concerns about fairness within the policy and research communities alike, with subsequent work showing that the trade-offs are more subtle than was initially apparent.
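The embedding-bias finding cited above (Caliskan, Bryson, and Narayanan 2017) comes down to comparing cosine similarities between word vectors. The sketch below is a minimal illustration of that comparison, not the authors' code: the four-dimensional vectors are invented for the example, and real analyses use pretrained embeddings (e.g., GloVe or word2vec) and aggregate the comparison over sets of target and attribute words (the Word Embedding Association Test).

    import numpy as np

    def cosine(u, v):
        """Cosine similarity between two embedding vectors."""
        return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

    # Toy 4-dimensional embeddings, invented purely for illustration.
    # Real studies use pretrained vectors trained on large text corpora.
    embeddings = {
        "nurse": np.array([0.8, 0.1, 0.3, 0.2]),
        "she":   np.array([0.7, 0.2, 0.4, 0.1]),
        "he":    np.array([0.1, 0.9, 0.2, 0.3]),
    }

    # The association measure: is "nurse" closer to "she" than to "he"?
    sim_she = cosine(embeddings["nurse"], embeddings["she"])
    sim_he = cosine(embeddings["nurse"], embeddings["he"])
    print(f"sim(nurse, she) = {sim_she:.3f}")
    print(f"sim(nurse, he)  = {sim_he:.3f}")
    # A positive gap means "nurse" leans toward the female attribute word.
    print(f"association gap = {sim_she - sim_he:.3f}")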
