IEEE Conference on Multimedia Information Processing and Retrieval

Distributed Layer-Partitioned Training for Privacy-Preserved Deep Learning

Abstract

Deep learning techniques have achieved remarkable results in many domains. Training deep learning models often requires large datasets, which may mean uploading sensitive information to the cloud to accelerate training. To adequately protect sensitive information, we propose distributed layer-partitioned training with step-wise activation functions for privacy-preserving deep learning. Experimental results show that our method is simple and effective.
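The abstract does not give implementation details, so the sketch below is only one plausible reading of layer-partitioned training with a step-wise activation at the partition boundary, written in PyTorch. ClientNet, ServerNet, the layer sizes, and the sign-based step activation with a straight-through gradient are all illustrative assumptions, not the paper's actual architecture: the point is that the early layers stay with the sensitive data and only discretized activations cross the cut to the remote part of the model.

```python
# Minimal sketch (assumption): split a small MLP at a cut layer. The client keeps
# the early layers and applies a step-wise activation before handing features to
# the server-side layers, so raw inputs never leave the client.
import torch
import torch.nn as nn

class StepActivation(torch.autograd.Function):
    """Hard step-wise activation with a straight-through estimator so gradients
    can still flow back to the client-side layers."""
    @staticmethod
    def forward(ctx, x):
        return torch.sign(x)          # quantize activations to {-1, 0, +1}
    @staticmethod
    def backward(ctx, grad_output):
        return grad_output            # straight-through gradient

class ClientNet(nn.Module):           # runs where the sensitive data lives
    def __init__(self):
        super().__init__()
        self.layers = nn.Sequential(nn.Linear(784, 256), nn.ReLU(),
                                    nn.Linear(256, 128))
    def forward(self, x):
        return StepActivation.apply(self.layers(x))  # only step-wise features leave

class ServerNet(nn.Module):           # runs remotely on the quantized features
    def __init__(self):
        super().__init__()
        self.layers = nn.Sequential(nn.Linear(128, 64), nn.ReLU(),
                                    nn.Linear(64, 10))
    def forward(self, h):
        return self.layers(h)

client, server = ClientNet(), ServerNet()
opt = torch.optim.SGD(list(client.parameters()) + list(server.parameters()), lr=0.1)
loss_fn = nn.CrossEntropyLoss()

x, y = torch.randn(32, 784), torch.randint(0, 10, (32,))  # toy batch
for _ in range(3):
    opt.zero_grad()
    loss = loss_fn(server(client(x)), y)
    loss.backward()                   # gradients cross the cut; raw inputs do not
    opt.step()
```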