{"id":159002,"date":"2010-01-01T00:00:00","date_gmt":"2010-01-01T00:00:00","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/msr-research-item\/differential-privacy-in-new-settings\/"},"modified":"2018-10-16T21:15:44","modified_gmt":"2018-10-17T04:15:44","slug":"differential-privacy-in-new-settings","status":"publish","type":"msr-research-item","link":"https:\/\/www.microsoft.com\/en-us\/research\/publication\/differential-privacy-in-new-settings\/","title":{"rendered":"Differential Privacy in New Settings"},"content":{"rendered":"
\n

Differential privacy is a recent notion of privacy tailored to the problem of statistical disclosure control: how to release statistical information about a set of people without compromising the privacy of any individual.
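The abstract does not spell the guarantee out formally; for reference, the standard definition of ε-differential privacy (which is what the text refers to) is:

```latex
% A randomized mechanism M is \varepsilon-differentially private if, for
% every pair of datasets D, D' differing in the data of a single individual,
% and for every set S of possible outputs,
\[
  \Pr[\mathcal{M}(D) \in S] \;\le\; e^{\varepsilon} \cdot \Pr[\mathcal{M}(D') \in S].
\]
```

Smaller ε gives a stronger guarantee: the mechanism's output distribution is nearly unchanged by the presence or absence of any one person's data.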

We describe new work that extends differentially private data analysis beyond the traditional setting of a trusted curator operating, in perfect isolation, on a static dataset. We ask
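As a concrete baseline for the traditional setting the abstract contrasts against, here is a minimal sketch of the Laplace mechanism, the canonical way a trusted curator can answer a numeric query over a static dataset with ε-differential privacy. The function name and dataset below are illustrative, not from the paper:

```python
import numpy as np

def laplace_mechanism(true_answer: float, sensitivity: float, epsilon: float) -> float:
    """Release a noisy answer satisfying epsilon-differential privacy.

    Adds Laplace noise with scale sensitivity/epsilon, where sensitivity is
    the most the query answer can change when one individual's record is
    added or removed.
    """
    scale = sensitivity / epsilon
    return true_answer + np.random.laplace(loc=0.0, scale=scale)

# Example: privately release a count over a static dataset. A counting
# query changes by at most 1 when one record changes, so sensitivity = 1.
dataset = [1, 0, 1, 1, 0, 1]  # hypothetical 0/1 records
noisy_count = laplace_mechanism(sum(dataset), sensitivity=1.0, epsilon=0.1)
print(noisy_count)
```

Calibrating the noise scale to sensitivity/ε is what makes the guarantee hold uniformly for every individual in the dataset; the settings the abstract previews ask what replaces this picture when the curator, the isolation, or the static dataset is taken away.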