Assessing Implicit Gender and Ethnic Biases in Selecting Individuals for Software Developer Positions



Abstract

This thesis presents a pre-research trial and a preliminary exploration of implicit bias, motivated by the researcher's interest in ethics and in the transfer of implicit bias to artificial intelligence (AI) applications. Human knowledge is used at some point in the development of knowledge bases for AI applications. The researcher was therefore interested in determining whether humans do have implicit ethnic and/or gender biases. This pre-research trial focuses on whether implicit human bias exists and therefore might be transferred to AI applications. The research is limited to exploring implicit gender and ethnic biases in human respondents only. Each respondent completed pre-test and post-test questionnaires to collect sample data, e.g., gender, age, and ethnic background. A toy task was then administered that required the respondents to rate very similar resumes of fictional job applicants, from five (excellent application) to one (poor application or poorly qualified for the job). We found that implicit gender and/or ethnic bias does not appear to exist in respondents aged 19-29, the majority of our sample. While the data show that respondents did associate names with specific gender and ethnic groups, they did not use this information in rating the resumes. One rather disturbing result was noticed in the ratings of Rebecca Johnson and Shaniqua Johnson: of the ten resumes, these two were rated as average, while the other eight were consistently rated higher. The small sample size of thirty-nine respondents may account for this finding. This result suggests that further research is required, even though the data analysis overall suggests a lack of implicit gender and/or ethnic biases.
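The analysis the abstract implies can be sketched as follows: each respondent scores every resume on the 5-to-1 scale, and per-resume mean ratings are compared to spot applicants who are consistently rated lower. This is only an illustrative reconstruction, not the thesis's actual procedure or data; all ratings below, and every applicant name other than Rebecca Johnson and Shaniqua Johnson, are made-up placeholders.

```python
from statistics import mean

# Hypothetical rating data: {applicant name: ratings from individual
# respondents, each on the 5 (excellent) .. 1 (poor) scale}.
# These numbers are illustrative placeholders, not the study's data.
ratings = {
    "Rebecca Johnson":  [3, 3, 2, 3],
    "Shaniqua Johnson": [3, 2, 3, 3],
    "Applicant C":      [4, 5, 4, 4],
    "Applicant D":      [5, 4, 4, 5],
}

# Overall mean across all resumes and respondents.
overall = mean(r for scores in ratings.values() for r in scores)

# Report per-resume means, flagging resumes rated below the overall mean,
# sorted from lowest-rated to highest-rated.
for name, scores in sorted(ratings.items(), key=lambda kv: mean(kv[1])):
    m = mean(scores)
    flag = "  <- below overall mean" if m < overall else ""
    print(f"{name:18s} mean={m:.2f}{flag}")
```

With real data, a difference-of-means test (or a non-parametric equivalent, given the ordinal 1-5 scale) would be the natural next step to check whether such gaps are statistically meaningful at this sample size.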

Record details

  • Author: Bat-Erdene, Erdenebileg.
  • Author affiliation: University of Nebraska at Omaha.
  • Degree grantor: University of Nebraska at Omaha.
  • Subjects: Information science; Information technology.
  • Degree: M.S.
  • Year: 2018
  • Pagination: 134 p.
  • Total pages: 134
  • Format: PDF
  • Language: eng
