Budget Sharing for Multi-Analyst Differential Privacy
Large organizations that collect data about populations (such as the US Census Bureau) release summary statistics about these populations to satisfy the data needs of multiple resource allocation and policy-making problems. These organizations are also legally required to protect the privacy of individuals, which makes differential privacy (DP) a natural solution. However, most differentially private mechanisms are designed to answer a single set of queries and optimize the total accuracy across the entire set. In reality, the multiple stakeholders that a data release must satisfy often have competing goals and priorities, and current differentially private query answering mechanisms provide no means to capture them.
In this work, we initiate the study of DP query answering across multiple analysts. To capture the competing goals and priorities of multiple analysts, we formulate three desiderata that any mechanism must satisfy in this setting while still optimizing for overall error – Sharing Incentive, Non-Interference, and Workload Adaptivity. We demonstrate that applying existing DP query answering mechanisms to the multi-analyst setting fails to satisfy all three desiderata simultaneously. We present novel DP algorithms that provably satisfy all our desiderata and empirically demonstrate that they incur low error on realistic tasks.
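To make the setting concrete, here is a minimal sketch of the naive baseline the talk's mechanisms improve on: splitting the total privacy budget uniformly across analysts and answering each analyst's queries with the Laplace mechanism. This is not the talk's algorithm; counting queries of sensitivity 1, the uniform split, and all function names are illustrative assumptions.

```python
import math
import random

def laplace_noise(scale):
    # Sample Laplace(0, scale) via inverse-CDF transform of Uniform(-0.5, 0.5).
    u = random.random() - 0.5
    sign = 1.0 if u >= 0 else -1.0
    return -scale * sign * math.log(1.0 - 2.0 * abs(u))

def answer_workloads(true_answers_per_analyst, total_epsilon):
    """Naive multi-analyst baseline (illustrative, not the talk's mechanism):
    split total_epsilon uniformly across analysts, then split each analyst's
    share uniformly across their queries. Assumes counting queries with
    sensitivity 1, so Laplace noise with scale 1/epsilon suffices per query."""
    k = len(true_answers_per_analyst)
    eps_per_analyst = total_epsilon / k
    noisy_answers = []
    for answers in true_answers_per_analyst:
        # Sequential composition: each query consumes an equal slice
        # of this analyst's budget.
        eps_per_query = eps_per_analyst / len(answers)
        scale = 1.0 / eps_per_query
        noisy_answers.append([a + laplace_noise(scale) for a in answers])
    return noisy_answers
```

A uniform split trivially gives every analyst an equal share of the budget, but it ignores the structure of each analyst's workload entirely, which is exactly the gap the Workload Adaptivity desideratum points at.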
David Pujol is a third-year Duke CS PhD student advised by Ashwin Machanavajjhala. His interests lie in differential privacy and algorithmic fairness, with a focus on designing fair differentially private systems.