Abstract: Differential privacy has emerged as the gold standard for private data analysis. However, there are use cases to which it does not directly apply. In this talk, we will look at two such use cases and the challenges they pose. The first is privacy of language representations, where we offer sentence-level privacy and propose a new mechanism that uses public data to maintain high fidelity. The second is privacy of location traces, where we use Gaussian process priors to model correlations in location trajectory data and offer privacy against an inferential adversary.
Joint work with Casey Meehan and Khalil Mrini.