We consider the problem of "privacy amplification by subsampling" under the Rényi Differential Privacy (RDP) framework (Mironov, 2017). This is the main workhorse underlying the moments accountant approach for differentially private deep learning (Abadi et al., 2016). Complementing a recent result on this problem that deals with "sampling without replacement" (Wang et al., 2019), we address the "Poisson subsampling" scheme, which selects each data point independently with probability γ. This seemingly minor change allows us to more precisely characterize the RDP of M ∘ PoissonSample. In particular, we prove an exact analytical formula for the case when M is the Gaussian mechanism or the Laplace mechanism. For general M, we prove an upper bound that is optimal up to an additive constant of log(3)/(α - 1) and a multiplicative factor of 1 + O(γ). Our result is the first of its kind that makes the moments accountant technique (Abadi et al., 2016) efficient and generally applicable for all Poisson-subsampled mechanisms. An open source implementation is available at https://github.com/yuxiangw/autodp.
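To make the setting concrete, the following is a minimal, hypothetical sketch (not the autodp implementation): Poisson subsampling includes each record independently with probability γ, and for the Gaussian mechanism with noise multiplier σ, the RDP of the subsampled mechanism at integer orders α admits a binomial-expansion closed form of the kind reported in the sampled-Gaussian literature. The function names and the specific expression below are illustrative assumptions, not the paper's stated formula.

```python
import math
import random


def poisson_sample(data, gamma, rng=random):
    """Poisson subsampling: include each record independently w.p. gamma."""
    return [x for x in data if rng.random() < gamma]


def subsampled_gaussian_rdp(alpha, gamma, sigma):
    """RDP epsilon(alpha) of the Poisson-subsampled Gaussian mechanism at an
    integer order alpha >= 2, via the binomial expansion

        (1/(alpha-1)) * log( sum_{k=0}^{alpha} C(alpha, k)
                             * (1-gamma)^(alpha-k) * gamma^k
                             * exp((k^2 - k) / (2 sigma^2)) ).

    At gamma = 1 this reduces to the unsubsampled Gaussian RDP alpha/(2 sigma^2);
    at gamma = 0 it is 0, reflecting the amplification effect.
    """
    total = sum(
        math.comb(alpha, k)
        * (1 - gamma) ** (alpha - k)
        * gamma ** k
        * math.exp((k * k - k) / (2 * sigma ** 2))
        for k in range(alpha + 1)
    )
    return math.log(total) / (alpha - 1)
```

For example, with γ = 0.01, σ = 2, and α = 8, the amplified RDP is orders of magnitude smaller than the unamplified value α/(2σ²) = 1, which is what makes the subsampled accountant so effective for private SGD.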