Annual Conference on Neural Information Processing Systems

Near-optimal Differentially Private Principal Components


Abstract

Principal components analysis (PCA) is a standard tool for identifying good low-dimensional approximations to data sets in high dimension. Many current data sets of interest contain private or sensitive information about individuals. Algorithms which operate on such data should be sensitive to the privacy risks in publishing their outputs. Differential privacy is a framework for developing tradeoffs between privacy and the utility of these outputs. In this paper we investigate the theory and empirical performance of differentially private approximations to PCA and propose a new method which explicitly optimizes the utility of the output. We demonstrate that on real data, there is a large performance gap between the existing method and our method. We show that the sample complexity for the two procedures differs in the scaling with the data dimension, and that our method is nearly optimal in terms of this scaling.
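To make the setting concrete, here is a minimal sketch of the input-perturbation baseline for differentially private PCA: symmetric noise calibrated to the sensitivity of the sample second-moment matrix is added before eigendecomposition. This illustrates the general approach, not the specific algorithm proposed in the paper; the function name and the exact noise calibration constant are illustrative assumptions.

```python
import numpy as np

def private_pca_input_perturbation(X, k, epsilon, rng=None):
    """Sketch of input-perturbation private PCA (a common baseline,
    not the paper's proposed method): add symmetric noise to the
    sample second-moment matrix, then take its top-k eigenvectors.

    Assumes each row of X has L2 norm at most 1, so changing one of
    the n rows changes the matrix by O(1/n) in norm.
    """
    if rng is None:
        rng = np.random.default_rng(0)
    n, d = X.shape
    A = (X.T @ X) / n  # sample second-moment matrix
    # Noise scale is illustrative; exact calibration to (epsilon)-DP
    # depends on the mechanism and sensitivity analysis used.
    scale = d / (n * epsilon)
    noise = rng.laplace(scale=scale, size=(d, d))
    noise = (noise + noise.T) / 2  # symmetrize so the spectrum stays real
    A_noisy = A + noise
    eigvals, eigvecs = np.linalg.eigh(A_noisy)
    # eigh returns eigenvalues in ascending order; keep the top-k
    # eigenvectors, ordered from largest eigenvalue down.
    return eigvecs[:, -k:][:, ::-1]
```

The utility of the output (how much variance the returned subspace captures) degrades as `epsilon` shrinks or the dimension `d` grows, which is exactly the privacy-utility tradeoff and dimension scaling that the paper analyzes.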
