Private and communication-efficient algorithms for entropy estimation

Gecia Bravo-Hermsdorff
Robert Busa-Fekete
Mohammad Ghavamzadeh

Abstract

Modern statistical estimation is often performed in a distributed setting where each sample
belongs to a single user who shares their data with a central server. Users are typically concerned both with preserving the privacy of their samples and with minimizing the amount of
data they must transmit to the server. We give improved private and communication-efficient
algorithms for estimating several popular measures of the entropy of a distribution. All of our
algorithms have constant communication cost and satisfy local differential privacy. For a joint
distribution over many variables whose conditional independence structure is given by a tree, we describe
algorithms for estimating Shannon entropy that require a number of samples that is linear in
the number of variables, compared to the quadratic sample complexity of prior work. We also
describe an algorithm for estimating Gini entropy whose sample complexity has no dependence
on the support size of the distribution and that can be implemented using a single round of
concurrent communication between the users and the server. In contrast, the previously
best-known algorithm has high communication cost and requires the server to facilitate
interaction between the users. Finally, we describe an algorithm for estimating collision
entropy that generalizes the best-known algorithm to the private and communication-efficient
setting.
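
For reference, a sketch of the standard definitions of the three entropy measures named above, for a discrete distribution $p = (p_1, \dots, p_k)$; these are the usual conventions, though the paper's exact choices (e.g., the base of the logarithm) are assumptions here:

% Standard definitions (assumed conventions; the paper may fix a different log base).
\[
  H(p) = -\sum_{i=1}^{k} p_i \log p_i
  \qquad \text{(Shannon entropy)}
\]
\[
  G(p) = 1 - \sum_{i=1}^{k} p_i^2
  \qquad \text{(Gini entropy)}
\]
\[
  H_2(p) = -\log \sum_{i=1}^{k} p_i^2
  \qquad \text{(collision entropy, i.e., R\'enyi entropy of order 2)}
\]

For a joint distribution over variables $X_1, \dots, X_m$ whose conditional independence structure is a tree with parent map $\pi$, Shannon entropy decomposes by the chain rule as
\[
  H(X_1, \dots, X_m) = \sum_{i=1}^{m} H\!\left(X_i \mid X_{\pi(i)}\right),
\]
a standard identity that suggests why a sample complexity linear in the number of variables is plausible in the tree-structured setting; whether this is the decomposition the paper exploits is an assumption.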