2012 Annual SRII Global Conference

Opportunistic Adversaries: On Imminent Threats to Learning-Based Business Automation

Abstract

False positives and negatives are inevitable in real-world classification problems. In general, machine-learning-based business process automation remains viable despite the reduced classification accuracy caused by such false decisions: the business models that replace human decision processes with automated ones cover the costs of introducing automation, and the losses from its rare mistakes, with the profits from relatively large savings in human-factor costs. However, under certain conditions attackers can outsmart a classifier at reasonable cost and thus destroy the business model that the learner system depends on. Attackers may eventually detect the misclassification cases they can benefit from and craft similar inputs that the unaware learner system will misclassify. We call adversaries of this type "opportunistic adversaries". This paper specifies the environmental patterns that expose vulnerabilities to opportunistic adversaries and presents some likely business scenarios for these threats. We then propose a countermeasure algorithm that detects such attacks through change detection in the post-classification data distributions. Experimental results show that our algorithm achieves higher detection accuracy than approaches based on outlier detection or change-point detection.
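
The countermeasure described in the abstract rests on detecting changes in the post-classification data distributions. As a rough illustration only, and not the authors' algorithm, the following Python sketch monitors the stream of classifier confidence scores and raises an alert when a sliding window of recent scores diverges from a trusted reference sample under a two-sample Kolmogorov-Smirnov test; the class name, window size, and significance threshold are hypothetical choices made for this example.

```python
# Minimal sketch (assumed design, not the paper's method): flag a potential
# opportunistic-adversary attack when the distribution of post-classification
# confidence scores drifts away from a trusted reference distribution.
from collections import deque
from scipy.stats import ks_2samp


class PostClassificationDriftMonitor:
    """Detects shifts in the distribution of post-classification scores."""

    def __init__(self, reference_scores, window_size=500, alpha=0.01):
        self.reference = list(reference_scores)   # scores from a trusted, attack-free period
        self.window = deque(maxlen=window_size)   # most recent scores seen in production
        self.alpha = alpha                        # significance level for the KS test

    def update(self, score):
        """Record one classifier confidence score; return True if drift is detected."""
        self.window.append(score)
        if len(self.window) < self.window.maxlen:
            return False                          # not enough recent evidence yet
        # Two-sample Kolmogorov-Smirnov test between the trusted reference
        # sample and the recent post-classification score sample.
        _stat, p_value = ks_2samp(self.reference, list(self.window))
        return p_value < self.alpha               # small p-value => distributions differ


# Hypothetical usage with a stream of live scores:
# monitor = PostClassificationDriftMonitor(reference_scores=historical_scores)
# for score in live_scores:
#     if monitor.update(score):
#         pass  # escalate to human review or trigger retraining
```

A distribution-level test of this kind is meant to contrast with per-sample outlier detection and single change-point detection, the baselines the abstract compares against.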

