Journal of Institutional and Theoretical Economics
When Algorithms Import Private Bias into Public Enforcement: The Promise and Limitations of Statistical Debiasing Solutions

Abstract

We make two contributions to understanding the role of algorithms in regulatory enforcement. First, we illustrate how big-data analytics can inadvertently import private biases into public policy. We show that a much-hyped use of predictive analytics, using consumer data to target food-safety enforcement, can disproportionately harm Asian establishments. Second, we study a solution by Pope and Sydnor (2011), which aims to debias predictors via marginalization while still using information from contested predictors. We find that the solution may be limited when protected groups have distinct predictor distributions, owing to model extrapolation. Common machine-learning techniques heighten these problems.
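To make the marginalization idea referenced in the abstract concrete, the following is a minimal sketch in the spirit of Pope and Sydnor (2011): estimate the model with the protected attribute included, then predict with that attribute replaced by its population mean. The simulated data, coefficients, and all variable names are hypothetical illustrations, not the paper's data or implementation.

```python
# Sketch of marginalization-style debiasing (in the spirit of Pope & Sydnor 2011).
# All data are simulated and all names are hypothetical.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 5_000

# Simulated targeting data: a protected-group indicator, an uncontested
# predictor, and a contested predictor (e.g. a consumer-review feature)
# that is correlated with group membership.
group = rng.binomial(1, 0.3, n)
legit = rng.normal(0.0, 1.0, n)
contested = 0.8 * group + rng.normal(0.0, 1.0, n)
risk = 1.0 * legit + 0.5 * contested + 0.7 * group + rng.normal(0.0, 1.0, n)

X_full = np.column_stack([legit, contested, group])

# Step 1: estimate WITH the protected attribute included, so the other
# coefficients are not contaminated by its omission.
full_model = LinearRegression().fit(X_full, risk)

# Step 2: predict after marginalizing the protected attribute, i.e. replace
# each unit's group value with the population mean so the attribute no
# longer differentiates individual scores.
X_marg = X_full.copy()
X_marg[:, 2] = group.mean()
marginalized_scores = full_model.predict(X_marg)

# Naive alternative: simply dropping the protected attribute lets the
# correlated contested predictor proxy for it, re-importing the bias.
naive_model = LinearRegression().fit(X_full[:, :2], risk)
naive_scores = naive_model.predict(X_full[:, :2])

for label, scores in [("marginalized", marginalized_scores),
                      ("naive drop", naive_scores)]:
    gap = scores[group == 1].mean() - scores[group == 0].mean()
    print(f"{label:12s} mean predicted-risk gap across groups: {gap:+.3f}")
```

In this toy setup the group gap shrinks under marginalization relative to naively dropping the attribute, but it does not vanish, since the contested predictor still enters with individual values; this is consistent with the abstract's caution that the approach can break down when protected groups have distinct predictor distributions.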
