On the (Im)Possibility of Estimating Various Notions of Differential Privacy (short paper)

Research output: Contribution to journal › Conference article › peer-review

Abstract

We analyze to what extent end users can infer information about the level of protection of their data when the data obfuscation mechanism is a priori unknown to them (the so-called "black-box" scenario). In particular, we delve into the investigation of two notions of local differential privacy (LDP), namely ε-LDP and Rényi LDP. On one hand, we prove that, without any assumption on the underlying distributions, it is not possible to have an algorithm able to infer the level of data protection with provable guarantees. On the other hand, we demonstrate that, under reasonable assumptions (namely, Lipschitzness of the involved densities on a closed interval), such guarantees exist and can be achieved by a simple histogram-based estimator.
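The histogram-based estimator mentioned in the abstract can be illustrated with a minimal sketch. The idea, under the stated assumptions (densities supported on a known closed interval), is to sample the mechanism's outputs for each input, histogram them over common bins, and take the largest log-ratio of bin frequencies as an empirical ε. The function name, smoothing choice, and bin count below are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def estimate_eps_ldp(samples_by_input, n_bins=30, support=(0.0, 1.0)):
    """Empirical black-box estimate of the epsilon-LDP level of a mechanism.

    samples_by_input: list of 1-D arrays, one array of observed outputs
    per secret input value. Assumes outputs lie in `support`.
    """
    edges = np.linspace(support[0], support[1], n_bins + 1)
    hists = []
    for s in samples_by_input:
        counts, _ = np.histogram(s, bins=edges)
        # Additive (Laplace) smoothing so empty bins don't give log(0);
        # this is an illustrative choice, not the paper's estimator.
        hists.append((counts + 1) / (len(s) + n_bins))
    eps = 0.0
    for i in range(len(hists)):
        for j in range(len(hists)):
            if i != j:
                # epsilon-LDP bounds sup_y log p(y|x)/p(y|x') over all pairs
                eps = max(eps, float(np.max(np.log(hists[i] / hists[j]))))
    return eps
```

For two inputs whose output distributions coincide, the estimate should be close to zero; the gap between the estimate and the true ε shrinks as the sample size grows and the bin width is tuned to the Lipschitz constant.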

Original language: English
Pages (from-to): 219-224
Number of pages: 6
Journal: CEUR Workshop Proceedings
Volume: 3587
Publication status: Published - 1 Jan 2023
Externally published: Yes
Event: 24th Italian Conference on Theoretical Computer Science, ICTCS 2023 - Palermo, Italy
Duration: 13 Sept 2023 – 15 Sept 2023
