Abstract
We analyze to what extent end users can infer the level of protection of their data when the data obfuscation mechanism is a priori unknown to them (the so-called "black-box" scenario). In particular, we investigate two notions of local differential privacy (LDP), namely ε-LDP and Rényi LDP. On one hand, we prove that, without any assumption on the underlying distributions, no algorithm can infer the level of data protection with provable guarantees. On the other hand, we demonstrate that, under reasonable assumptions (namely, Lipschitzness of the involved densities on a closed interval), such guarantees exist and can be achieved by a simple histogram-based estimator.
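To illustrate the idea behind a histogram-based estimator of the ε-LDP level, the sketch below compares empirical histograms of a black-box mechanism's outputs on two adjacent inputs and takes the largest log-ratio over the bins. This is an illustrative reconstruction under stated assumptions (the function name, the binning scheme, and the handling of empty bins are choices made here), not the paper's exact algorithm:

```python
import numpy as np

def estimate_ldp_epsilon(samples_a, samples_b, bins=20, support=(0.0, 1.0)):
    """Rough histogram-based estimate of the epsilon-LDP level.

    samples_a, samples_b: outputs of the (unknown) obfuscation mechanism
    on two adjacent inputs, observed in a black-box fashion.
    Bins with zero mass in either histogram are skipped, so the value
    returned is a lower bound on the true log-likelihood ratio.
    """
    edges = np.linspace(support[0], support[1], bins + 1)
    # Empirical density estimates of the two output distributions.
    p, _ = np.histogram(samples_a, bins=edges, density=True)
    q, _ = np.histogram(samples_b, bins=edges, density=True)
    mask = (p > 0) & (q > 0)
    if not mask.any():
        return 0.0
    # epsilon-LDP bounds sup_y |log p(y)/q(y)|; estimate it over the bins.
    return float(np.abs(np.log(p[mask] / q[mask])).max())
```

The Lipschitzness assumption on a closed interval mentioned in the abstract is what makes a piecewise-constant (histogram) approximation of the densities controllable, and hence the estimate provably accurate as the number of samples grows and the bin width shrinks.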
| Original language | English |
|---|---|
| Pages (from-to) | 219-224 |
| Number of pages | 6 |
| Journal | CEUR Workshop Proceedings |
| Volume | 3587 |
| Publication status | Published - 1 Jan 2023 |
| Externally published | Yes |
| Event | 24th Italian Conference on Theoretical Computer Science, ICTCS 2023 - Palermo, Italy Duration: 13 Sept 2023 → 15 Sept 2023 |