
Privately Learning Smooth Distributions on the Hypercube by Projections

Research output: Contribution to journal › Conference article › peer-review

Abstract

Fueled by the ever-increasing need for statistics that guarantee the privacy of their training sets, this article studies the centrally private estimation of Sobolev-smooth probability densities on the hypercube in dimension d. The contributions of this article are two-fold. First, it generalizes the one-dimensional results of (Lalanne et al., 2023b) to non-integer levels of smoothness and to a high-dimensional setting, which is important for two reasons: it is better suited to modern learning tasks, and it clarifies the relations between privacy, dimensionality and smoothness, a central question in differential privacy. Second, this article presents a data-driven (usually referred to as adaptive in Statistics) private estimation strategy that chooses, among a finite family of private projection estimators, one achieving a good bias-variance trade-off without prior knowledge of the ground-truth smoothness β. This is achieved by adapting the Lepskii method to private selection, adding a new penalization term that makes the estimation privacy-aware.
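
To make the notion of a private projection estimator concrete, the following is a minimal illustrative sketch, not the paper's exact construction: it estimates a density on [0,1]^d by releasing the empirical coefficients of a truncated tensor cosine basis through the Laplace mechanism (epsilon-DP in the central model). The function names (private_projection_density, cosine_feature) and the parameter max_freq are hypothetical choices made for this example; the adaptive Lepskii-type selection described in the abstract is not implemented here.

    # Illustrative sketch (assumption: tensor cosine basis + Laplace mechanism),
    # not the authors' exact estimator.
    import itertools
    import numpy as np

    def cosine_feature(x, k):
        """Tensor cosine basis phi_k(x) = prod_j phi_{k_j}(x_j) on [0,1]^d,
        with phi_0 = 1 and phi_m(t) = sqrt(2) * cos(pi * m * t)."""
        vals = np.ones(x.shape[0])
        for j, kj in enumerate(k):
            if kj > 0:
                vals *= np.sqrt(2.0) * np.cos(np.pi * kj * x[:, j])
        return vals

    def private_projection_density(X, epsilon, max_freq, rng=None):
        """Epsilon-DP projection density estimate from n samples X in [0,1]^d.

        The empirical coefficients a_k = mean_i phi_k(X_i) are released with
        Laplace noise; changing one sample moves each coefficient by at most
        2 * ||phi_k||_inf / n, so the joint L1 sensitivity of the coefficient
        vector is bounded by (2 / n) * sum_k ||phi_k||_inf.
        """
        rng = np.random.default_rng(rng)
        n, d = X.shape
        freqs = list(itertools.product(range(max_freq + 1), repeat=d))
        coeffs, sup_norms = [], []
        for k in freqs:
            coeffs.append(cosine_feature(X, k).mean())
            sup_norms.append(np.sqrt(2.0) ** sum(kj > 0 for kj in k))
        sensitivity = 2.0 * np.sum(sup_norms) / n
        noisy = np.array(coeffs) + rng.laplace(scale=sensitivity / epsilon,
                                               size=len(freqs))

        def density(x):
            x = np.atleast_2d(x)
            return sum(a * cosine_feature(x, k) for a, k in zip(noisy, freqs))

        return density

    # Usage: estimate a density from uniform samples on [0,1]^2.
    X = np.random.default_rng(0).uniform(size=(5000, 2))
    f_hat = private_projection_density(X, epsilon=1.0, max_freq=3)
    print(f_hat(np.array([[0.5, 0.5]])))  # close to 1 for the uniform density

The bias-variance trade-off mentioned in the abstract appears here through max_freq: more frequencies reduce approximation bias but increase both the sampling variance and the amount of Laplace noise needed for a fixed privacy budget, which is what the adaptive selection step must balance.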

Original language: English
Pages (from-to): 25936-25975
Number of pages: 40
Journal: Proceedings of Machine Learning Research
Volume: 235
Publication status: Published - 1 Jan 2024
Externally published: Yes
Event: 41st International Conference on Machine Learning, ICML 2024 - Vienna, Austria
Duration: 21 Jul 2024 - 27 Jul 2024

