A new model of differential privacy, termed trust graph DP (TGDP), is proposed to accommodate varying levels of trust among users in data-sharing scenarios: each user's data is protected against every party except the users they trust. The model interpolates between central DP (every user trusts the curator) and local DP (no user trusts anyone), allowing for more nuanced privacy controls, and it comes with algorithms and error bounds for aggregation tasks whose accuracy depends on the structure of the trust graph. The approach has implications for federated learning and other applications requiring privacy-preserving data sharing.
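One way the trust structure can be exploited for aggregation is to have each user hand their raw value to a trusted aggregator, which noises only its local sum. The sketch below is illustrative and not the paper's algorithm: the greedy aggregator selection, the function name, and the Laplace-noise choice are all assumptions made for the example.

```python
import random

def tgdp_sum(values, trusts, epsilon):
    """Estimate sum(values) when trusts[i] is the set of users that user i
    trusts (raw data may only be revealed to trusted users)."""
    n = len(values)
    # Invert the relation: trusted_by[a] = users willing to send raw data to a.
    trusted_by = [set() for _ in range(n)]
    for u in range(n):
        for a in trusts[u] | {u}:
            trusted_by[a].add(u)
    # Greedily pick aggregators until every user can reach one it trusts.
    uncovered = set(range(n))
    groups = {}
    while uncovered:
        best = max(range(n), key=lambda a: len(trusted_by[a] & uncovered))
        groups[best] = trusted_by[best] & uncovered
        uncovered -= groups[best]
    # Each aggregator sums the raw values it received and adds Laplace(1/eps)
    # noise; total error grows with the number of aggregators needed, so a
    # denser trust graph yields a more accurate estimate.
    total = 0.0
    for agg, members in groups.items():
        noise = random.expovariate(epsilon) - random.expovariate(epsilon)
        total += sum(values[u] for u in members) + noise
    return total
```

With a complete trust graph a single aggregator suffices and the error matches central DP; with an empty graph every user is their own aggregator, recovering local-DP-style noise per user.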
Differentially private partition selection enables the safe release of meaningful data subsets (for example, the set of keys appearing in user data) while preserving individual privacy. The MaxAdaptiveDegree (MAD) algorithm improves the utility of the released set by adaptively reallocating weight from items that are already comfortably above the release threshold to less popular items, achieving state-of-the-art results on massive datasets, including the Common Crawl dataset. The algorithm has been open-sourced to foster collaboration and innovation in the research community.
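For context, the standard non-adaptive baseline that MAD improves on works as follows: each user spreads unit weight over their items, Gaussian noise is added to each item's total weight, and only items clearing a threshold are released. The sketch below shows this baseline only (MAD's adaptive weight reallocation is omitted); the function name and the specific noise and threshold formulas are illustrative assumptions, not the released implementation.

```python
import math
import random

def dp_partition_selection(user_items, epsilon, delta):
    """Release items (partitions) whose noisy total weight clears a threshold.

    user_items: one set of items per user. Each user splits unit weight
    uniformly over their items, so any one user's contribution vector has
    L2 norm 1 (bounded sensitivity for the Gaussian mechanism).
    """
    weights = {}
    for items in user_items:
        if not items:
            continue
        w = 1.0 / math.sqrt(len(items))  # uniform split, L2 norm exactly 1
        for item in items:
            weights[item] = weights.get(item, 0.0) + w
    # Gaussian-mechanism noise scale, with the threshold set high enough
    # that items touched by very few users are unlikely to be released.
    sigma = math.sqrt(2.0 * math.log(1.25 / delta)) / epsilon
    threshold = 1.0 + sigma * math.sqrt(2.0 * math.log(1.0 / delta))
    return {item for item, w in weights.items()
            if w + random.gauss(0.0, sigma) >= threshold}
```

The inefficiency MAD targets is visible here: an item held by millions of users accumulates weight far beyond the threshold, and all of that excess is wasted. Reallocating it toward items sitting just below the threshold lets more of the long tail be released at the same privacy cost.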