Workshop on Gender Bias in Natural Language Processing

Debiasing Embeddings for Reduced Gender Bias in Text Classification


Abstract

Bolukbasi et al. (2016) demonstrated that pre-trained word embeddings can inherit gender bias from the data they were trained on. We investigate how this bias affects downstream classification tasks, using the case study of occupation classification (De-Arteaga et al., 2019). We show that traditional techniques for debiasing embeddings can actually worsen the bias of the downstream classifier by providing a less noisy channel for communicating gender information. With a relatively minor adjustment, however, we show how these same techniques can be used to simultaneously reduce bias and maintain high classification accuracy.
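The "traditional" debiasing the abstract refers to is hard debiasing in the style of Bolukbasi et al. (2016), which projects the component along an estimated gender direction out of each embedding. A minimal sketch, using toy vectors (all numeric values here are hypothetical, not from the paper):

```python
import numpy as np

def debias(embeddings, gender_direction):
    """Hard-debias: remove each embedding's component along the
    (unit-normalized) gender direction via orthogonal projection."""
    g = gender_direction / np.linalg.norm(gender_direction)
    return embeddings - np.outer(embeddings @ g, g)

# Toy gender direction, e.g. the difference of "he" and "she" vectors
# (hypothetical 3-d values for illustration).
he = np.array([0.8, 0.1, 0.3])
she = np.array([0.2, 0.1, 0.3])
direction = he - she

words = np.array([[0.5, 0.4, 0.2],
                  [0.1, 0.9, 0.6]])
debiased = debias(words, direction)

# After projection, every vector is orthogonal to the gender direction.
print(np.allclose(debiased @ direction, 0.0))  # True
```

The paper's point is that feeding such "cleaned" embeddings to a classifier can backfire: the projection removes noise along the gender axis, so any residual gender signal reaching the model becomes easier to exploit, which is why the adjustment they propose is needed.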
