UNESCO Chair in Data Privacy International Conference on Privacy in Statistical Databases

On-Average KL-Privacy and Its Equivalence to Generalization for Max-Entropy Mechanisms

Abstract

We define On-Average KL-Privacy and present its properties and connections to differential privacy, generalization, and information-theoretic quantities including max-information and mutual information. The new definition significantly weakens differential privacy while preserving its minimal design features, such as composition over small groups and multiple queries as well as closure under post-processing. Moreover, we show that On-Average KL-Privacy is equivalent to generalization for a large class of commonly used tools in statistics and machine learning that sample from Gibbs distributions, a class of distributions that arises naturally from the maximum entropy principle. In addition, a byproduct of our analysis yields a lower bound on generalization error in terms of mutual information, which reveals an interesting interplay with known upper bounds that use the same quantity.
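For concreteness, the display below sketches the shape such a definition typically takes; the exact statement, including the choice of neighboring relation, is in the paper. Here $\mathcal{A}$ is the randomized algorithm, $\mathcal{D}$ the data distribution, and $Z^{(i \to z')}$ denotes the dataset $Z$ with its $i$-th record replaced by an independent draw $z'$; the notation is our assumption, not a quotation from the paper.

```latex
% Hedged sketch only: the privacy loss is a KL divergence between the
% mechanism's output on a dataset and on a replace-one neighbour,
% averaged over data drawn from the underlying distribution D.
\[
  \mathbb{E}_{Z \sim \mathcal{D}^n,\; z' \sim \mathcal{D}}
  \Big[ D_{\mathrm{KL}}\big( \mathcal{A}(Z) \,\big\|\, \mathcal{A}(Z^{(i \to z')}) \big) \Big]
  \;\le\; \varepsilon .
\]
```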
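The Gibbs distributions mentioned in the abstract have the form $p(\theta \mid Z) \propto \exp(-\gamma \sum_i \ell(\theta, z_i))$, the posterior-sampling family. The following is a minimal Monte Carlo sketch of the averaged-KL quantity for one such mechanism, using a squared loss so that the output distribution is Gaussian and the KL divergence has a closed form. The setting, parameter names, and constants are illustrative assumptions, not the paper's experiments.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative Gibbs ("posterior sampling") mechanism for the squared loss
# l(theta, z) = (theta - z)^2 / 2: it releases one draw from
#     p(theta | Z)  ∝  exp(-gamma * sum_i (theta - z_i)^2 / 2),
# i.e. the Gaussian N(mean(Z), 1 / (n * gamma)).  All constants are assumed.

n, gamma = 100, 0.5       # dataset size and inverse temperature
trials = 100_000          # Monte Carlo repetitions

def output_dist(Z):
    """Mean and variance of the mechanism's (Gaussian) output distribution."""
    return Z.mean(), 1.0 / (len(Z) * gamma)

def kl_gauss(m1, v1, m2, v2):
    """Closed-form KL( N(m1, v1) || N(m2, v2) )."""
    return 0.5 * (np.log(v2 / v1) + (v1 + (m1 - m2) ** 2) / v2 - 1.0)

kls = np.empty(trials)
for t in range(trials):
    Z = rng.standard_normal(n)         # Z ~ D^n with D = N(0, 1)
    Z_swap = Z.copy()
    Z_swap[0] = rng.standard_normal()  # replace one record with a fresh draw
    kls[t] = kl_gauss(*output_dist(Z), *output_dist(Z_swap))

# For this toy mechanism the average works out to exactly gamma / n.
print(f"Monte Carlo: {kls.mean():.5f}   analytic gamma/n: {gamma / n:.5f}")
```

In this toy setting the two output distributions are Gaussians with equal variance $1/(n\gamma)$, so the averaged KL collapses to $\gamma/n$; the Monte Carlo estimate should match that value to within sampling error, which makes the closed form a convenient sanity check.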
