Privacy-Aware Rejection Sampling
Jordan Awan, Vinayak Rao; 24(74):1–32, 2023.
Abstract
In this study, we examine the vulnerability of differential privacy (DP) mechanisms to side-channel attacks, specifically timing attacks, and propose a privacy-aware rejection sampling algorithm to mitigate these risks. Although DP offers strong theoretical privacy guarantees, the runtime of DP mechanisms based on MCMC or rejection sampling can inadvertently leak private information. We analyze the additional privacy cost incurred by the runtime of a rejection sampler in terms of both $(\epsilon,\delta)$-DP and $f$-DP. We show that unless the acceptance probability is constant across databases, the runtime of a rejection sampler fails to satisfy $\epsilon$-DP for any $\epsilon$, and we identify a similar privacy breakdown in adaptive rejection samplers. To address these issues, we propose three modifications to the rejection sampling algorithm, each under different assumptions, that make the runtime independent of the data and thereby protect against timing attacks. The modification with the weakest assumptions introduces a small increase in the privacy cost and yields only an approximate sampler, whereas the other two modifications give perfect samplers. Furthermore, we use these techniques to develop an adaptive rejection sampler for log-Hölder densities, which also has data-independent runtime. We provide several examples of DP mechanisms that satisfy the assumptions of our methods and can be implemented using our samplers.
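To make the timing side channel concrete, the following is a minimal sketch, not the paper's algorithm: the names `sample_proposal`, `accept_prob`, and the proposal budget `T` are hypothetical. A standard rejection sampler's iteration count is geometric with mean $1/p$, where $p$ is the acceptance probability, so a data-dependent $p$ leaks through the runtime. One simple fix in the spirit of the abstract's weakest-assumption modification is to always draw exactly $T$ proposals, which makes the iteration count data-independent at the cost of a small failure probability (an approximate sampler).

```python
import math
import random

def rejection_sampler(sample_proposal, accept_prob):
    # Standard rejection sampler: the number of loop iterations is
    # geometric with mean 1/p, where p is the (possibly data-dependent)
    # acceptance probability -- so the runtime is a timing side channel.
    while True:
        x = sample_proposal()
        if random.random() < accept_prob(x):
            return x

def fixed_runtime_rejection_sampler(sample_proposal, accept_prob, T):
    # Illustrative data-independent-runtime variant (a hypothetical
    # construction, not the paper's exact method): always draw exactly
    # T proposals, then report the first accepted one.  The iteration
    # count no longer depends on the data, at the cost of failing with
    # probability (1 - p)^T -- i.e., an approximate sampler.
    result = None
    for _ in range(T):
        x = sample_proposal()
        if result is None and random.random() < accept_prob(x):
            result = x
    return result

# Example: sample from the density proportional to exp(-x^2 / 2) on [0, 1]
# using Uniform(0, 1) proposals; exp(-x^2 / 2) <= 1 bounds the density ratio.
draw = fixed_runtime_rejection_sampler(
    sample_proposal=random.random,
    accept_prob=lambda x: math.exp(-x ** 2 / 2),
    T=50,
)
```

Note that the fixed-budget loop deliberately continues for all $T$ iterations even after an acceptance, so the loop count itself carries no information about the data.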