
Individual Differential Privacy: A Utility-Preserving Formulation of Differential Privacy Guarantees



Abstract

Differential privacy is a popular privacy model within the research community because of the strong privacy guarantee it offers, namely that the presence or absence of any individual in a data set does not significantly influence the results of analyses on the data set. However, enforcing this strict guarantee in practice significantly distorts data and/or limits data uses, thus diminishing the analytical utility of the differentially private results. In an attempt to address this shortcoming, several relaxations of differential privacy have been proposed that trade off privacy guarantees for improved data utility. In this paper, we argue that the standard formalization of differential privacy is stricter than required by the intuitive privacy guarantee it seeks. In particular, the standard formalization requires indistinguishability of results between any pair of neighbor data sets, while indistinguishability between the actual data set and its neighbor data sets should be enough. This limits the data controller’s ability to adjust the level of protection to the actual data, hence resulting in significant accuracy loss. In this respect, we propose individual differential privacy, an alternative differential privacy notion that offers the same privacy guarantees as standard differential privacy to individuals (even though not to groups of individuals). This new notion allows the data controller to adjust the distortion to the actual data set, which results in less distortion and more analytical accuracy. We propose several mechanisms to attain individual differential privacy and we compare the new notion against standard differential privacy in terms of the accuracy of the analytical results.
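To make the contrast described in the abstract concrete, the following is a brief sketch of the two notions in conventional differential-privacy notation; the symbols (a randomized mechanism \(\mathcal{K}\), data sets \(D\) and \(D'\), privacy budget \(\varepsilon\), output set \(S\)) are standard notation chosen here for illustration, not quotations from the paper.

```latex
% \varepsilon-differential privacy: outputs must be (almost) indistinguishable
% for EVERY pair of neighbouring data sets D, D' (differing in one individual's record):
\[
  \Pr[\mathcal{K}(D) \in S] \le e^{\varepsilon}\,\Pr[\mathcal{K}(D') \in S]
  \quad \text{for all neighbours } D, D' \text{ and all } S \subseteq \mathrm{Range}(\mathcal{K}).
\]

% \varepsilon-individual differential privacy, as described in the abstract:
% indistinguishability is required only between the ACTUAL data set D
% held by the controller and its neighbours D':
\[
  e^{-\varepsilon}\,\Pr[\mathcal{K}(D') \in S]
    \le \Pr[\mathcal{K}(D) \in S]
    \le e^{\varepsilon}\,\Pr[\mathcal{K}(D') \in S]
  \quad \text{for all neighbours } D' \text{ of the actual } D \text{ and all } S.
\]
```

Because the quantifier ranges only over neighbours of the data set actually held by the controller, the distortion a mechanism must introduce can be calibrated to that data set rather than to the worst case over all possible data sets, which is the source of the accuracy gains claimed in the abstract.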

Bibliographic Information

  • Source: IEEE Transactions on Information Forensics and Security
  • Author Affiliations

    Department of Computer Engineering and Mathematics, UNESCO Chair in Data Privacy, Universitat Rovira i Virgili, Tarragona, Catalonia;

    Department of Computer Engineering and Mathematics, UNESCO Chair in Data Privacy, Universitat Rovira i Virgili, Tarragona, Catalonia;

    Department of Computer Engineering and Mathematics, UNESCO Chair in Data Privacy, Universitat Rovira i Virgili, Tarragona, Catalonia;

    Estudis d’Informàtica, Multimèdia i Telecomunicació, Internet Interdisciplinary Institute, Universitat Oberta de Catalunya, Castelldefels, Catalonia;

  • Indexing Information
  • Format: PDF
  • Language: English
  • CLC Classification
  • Keywords

    Privacy; Sensitivity; Standards; Distortion; Data models; Data protection;

  • Date Added: 2022-08-17 13:06:00

