IEEE International Conference on Acoustics, Speech and Signal Processing

Witchcraft: Efficient PGD Attacks with Random Step Size



Abstract

State-of-the-art adversarial attacks on neural networks use expensive iterative methods and numerous random restarts from different initial points. Iterative FGSM-based methods without restarts trade off performance for computational efficiency because they do not adequately explore the image space and are highly sensitive to the choice of step size. We propose a variant of Projected Gradient Descent (PGD) that uses a random step size to improve performance without resorting to expensive random restarts. Our method, Wide Iterative Stochastic crafting (WITCHcraft), achieves results superior to the classical PGD attack on the CIFAR-10 and MNIST data sets but without additional computational cost. This simple modification of PGD makes crafting attacks more economical, which is important in situations like adversarial training where attacks need to be crafted in real time.
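The core idea in the abstract can be sketched in a few lines: run standard PGD, but draw a fresh random step size at every iteration rather than using a fixed one. The following is a minimal NumPy illustration against a toy logistic-regression model, not the authors' implementation; the model, the `uniform(0, 2*eps/steps)` sampling range, and all names here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy logistic-regression "model": p(y=1 | x) = sigmoid(w.x + b).
# Any differentiable classifier would do; this keeps the sketch self-contained.
w = np.array([2.0, -3.0, 1.0])
b = 0.5

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def loss_and_grad(x, y):
    # Cross-entropy loss and its gradient with respect to the *input* x.
    p = sigmoid(w @ x + b)
    loss = -(y * np.log(p) + (1 - y) * np.log(1 - p))
    grad = (p - y) * w
    return loss, grad

def pgd_random_step(x0, y, eps=0.1, steps=20):
    # PGD with a step size drawn at random each iteration (the idea the
    # abstract describes), instead of the usual fixed step size.
    x = x0.copy()
    for _ in range(steps):
        _, grad = loss_and_grad(x, y)
        alpha = rng.uniform(0.0, 2.0 * eps / steps)  # random step size (assumed range)
        x = x + alpha * np.sign(grad)                # ascend the loss
        x = x0 + np.clip(x - x0, -eps, eps)          # project onto the L-inf eps-ball
        x = np.clip(x, 0.0, 1.0)                     # stay in the valid pixel range
    return x

x0 = np.array([0.2, 0.8, 0.5])  # a "clean" input
y = 1                           # its true label
x_adv = pgd_random_step(x0, y)  # adversarial example: higher loss, bounded perturbation
```

Because the projection runs after every step, the randomness changes how the eps-ball is explored without ever violating the perturbation budget, which is why no extra restarts (and no extra gradient evaluations) are needed.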

