Abstract

We give new upper and lower bounds on the minimax sample complexity of differentially private mean estimation of distributions with bounded $k$-th moments. Roughly speaking, in the univariate case, we show that $n = \Theta\left(\frac{1}{\alpha^2} + \frac{1}{\alpha^{\frac{k}{k-1}}\varepsilon}\right)$ samples are necessary and sufficient to estimate the mean to $\alpha$-accuracy under $\varepsilon$-differential privacy, or any of its common relaxations. This result demonstrates a qualitatively different behavior compared to estimation absent privacy constraints, for which the sample complexity is identical for all $k \geq 2$. We also give algorithms for the multivariate setting whose sample complexity is a factor of $O(d)$ larger than the univariate case.

Description

[2002.09464] Private Mean Estimation of Heavy-Tailed Distributions
