Preserving differential privacy under finite-precision semantics

Research output: Contribution to journal › Article › peer-review

Abstract

The approximation introduced by finite-precision representation of continuous data can induce arbitrarily large information leaks even when the computation using exact semantics is secure. Such leakage can thus undermine design efforts aimed at protecting sensitive information. We focus here on differential privacy, an approach to privacy that emerged from the area of statistical databases and is now widely applied also in other domains. In this approach, privacy is protected by adding noise to the values correlated to the private data. The typical mechanisms used to achieve differential privacy have been proved correct in the ideal case in which computations are made using infinite-precision semantics. In this paper, we analyze the situation at the implementation level, where the semantics is necessarily limited by finite precision, i.e., the representation of real numbers and the operations on them are rounded according to some level of precision. We show that in general there are violations of the differential privacy property, and we study the conditions under which we can still guarantee a limited (but, arguably, acceptable) variant of the property, under only a minor degradation of the privacy level.
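As a concrete illustration (not the paper's own code), the sketch below implements the textbook Laplace mechanism, the standard noise-adding mechanism for differential privacy; the function and parameter names are ours. Under exact real arithmetic the mechanism is ε-differentially private, but each floating-point operation rounds its result, so the released values fall on a discrete, input-dependent subset of the representable doubles — exactly the kind of finite-precision artefact through which information can leak.

```python
import math
import random

def laplace_mechanism(true_value, sensitivity, epsilon):
    """Textbook Laplace mechanism: release true_value + Laplace(sensitivity/epsilon) noise.

    The epsilon-differential-privacy proof assumes exact real arithmetic.
    In floating point, the log/subtraction/addition below each round,
    so the set of reachable outputs depends on true_value.
    """
    scale = sensitivity / epsilon
    # Sample Laplace(scale) as the difference of two exponentials;
    # expovariate itself computes -log(1 - random())/lambd, which rounds.
    noise = random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)
    # Final addition rounds again: the result lands on a grid of doubles
    # centred on true_value, which differs between adjacent databases.
    return true_value + noise
```

One mitigation in the spirit of the "limited but acceptable" variant studied in the paper is to round the released value onto a coarse, input-independent grid, accepting a small degradation of the privacy level in exchange for closing the rounding channel.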

Original language: English
Pages (from-to): 92-108
Number of pages: 17
Journal: Theoretical Computer Science
Volume: 655
DOIs
Publication status: Published - 6 Dec 2016

Keywords

  • Differential privacy
  • Finite-precision arithmetic
  • Robustness to errors
