{"id":145478,"date":"2006-03-01T00:00:00","date_gmt":"2006-03-01T00:00:00","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/msr-research-item\/calibrating-noise-to-sensitivity-in-private-data-analysis\/"},"modified":"2018-10-16T22:32:05","modified_gmt":"2018-10-17T05:32:05","slug":"calibrating-noise-to-sensitivity-in-private-data-analysis","status":"publish","type":"msr-research-item","link":"https:\/\/www.microsoft.com\/en-us\/research\/publication\/calibrating-noise-to-sensitivity-in-private-data-analysis\/","title":{"rendered":"Calibrating Noise to Sensitivity in Private Data Analysis"},"content":{"rendered":"

<p>We continue a line of research initiated in [10,11] on privacy-preserving statistical databases. Consider a trusted server that holds a database of sensitive information. Given a query function <em>f<\/em> mapping databases to reals, the so-called <em>true answer<\/em> is the result of applying <em>f<\/em> to the database. To protect privacy, the true answer is perturbed by the addition of random noise generated according to a carefully chosen distribution, and this response, the true answer plus noise, is returned to the user.<\/p>\n

<p>Previous work focused on the case of noisy sums, in which <em>f<\/em> = \u2211<sub><em>i<\/em><\/sub> <em>g<\/em>(<em>x<\/em><sub><em>i<\/em><\/sub>), where <em>x<\/em><sub><em>i<\/em><\/sub> denotes the <em>i<\/em>th row of the database and <em>g<\/em> maps database rows to [0,1]. We extend the study to general functions <em>f<\/em>, proving that privacy can be preserved by calibrating the standard deviation of the noise according to the <em>sensitivity<\/em> of the function <em>f<\/em>. Roughly speaking, this is the amount that any single argument to <em>f<\/em> can change its output. The new analysis shows that for several particular applications substantially less noise is needed than was previously understood to be the case.<\/p>\n

<p>The first step is a very clean characterization of privacy in terms of indistinguishability of transcripts. Additionally, we obtain separation results showing the increased value of interactive sanitization mechanisms over non-interactive.<\/p>\n","protected":false},"excerpt":{"rendered":"

We continue a line of research initiated in [10,11] on privacy-preserving statistical databases. Consider a trusted server that holds a database of sensitive information. Given a query function f mapping databases to reals, the so-called true answer is the result of applying f to the database. To protect privacy, the true answer is perturbed by […]<\/p>\n","protected":false},"featured_media":0,"template":"","meta":{"msr-url-field":"","msr-podcast-episode":"","msrModifiedDate":"","msrModifiedDateEnabled":false,"ep_exclude_from_search":false,"_classifai_error":"","footnotes":""},"msr-content-type":[3],"msr-research-highlight":[],"research-area":[13563,13558],"msr-publication-type":[193716],"msr-product-type":[],"msr-focus-area":[],"msr-platform":[],"msr-download-source":[],"msr-locale":[268875],"msr-post-option":[],"msr-field-of-study":[],"msr-conference":[],"msr-journal":[],"msr-impact-theme":[],"msr-pillar":[],"class_list":["post-145478","msr-research-item","type-msr-research-item","status-publish","hentry","msr-research-area-data-platform-analytics","msr-research-area-security-privacy-cryptography","msr-locale-en_us"],"msr_publishername":"Springer","msr_edition":"Third Theory of Cryptography Conference (TCC 2006)","msr_affiliation":"","msr_published_date":"2006-03-01","msr_host":"","msr_duration":"","msr_version":"","msr_speaker":"","msr_other_contributors":"","msr_booktitle":"","msr_pages_string":"265-284","msr_chapter":"","msr_isbn":"3-540-32731-2","msr_journal":"","msr_volume":"3876","msr_number":"","msr_editors":"","msr_series":"Lecture Notes in Computer 
Science","msr_issue":"","msr_organization":"","msr_how_published":"","msr_notes":"","msr_highlight_text":"","msr_release_tracker_id":"","msr_original_fields_of_study":"","msr_download_urls":"","msr_external_url":"","msr_secondary_video_url":"","msr_longbiography":"","msr_microsoftintellectualproperty":1,"msr_main_download":"447165","msr_publicationurl":"http:\/\/dx.doi.org\/10.1007\/11681878_14","msr_doi":"","msr_publication_uploader":[{"type":"file","title":"dmns06","viewUrl":"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2006\/03\/dmns06.pdf","id":447165,"label_id":0},{"type":"url","title":"http:\/\/dx.doi.org\/10.1007\/11681878_14","viewUrl":false,"id":false,"label_id":0}],"msr_related_uploader":"","msr_attachments":[{"id":0,"url":"http:\/\/dx.doi.org\/10.1007\/11681878_14"}],"msr-author-ordering":[{"type":"user_nicename","value":"dwork","user_id":31702,"rest_url":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/microsoft-research\/v1\/researchers?person=dwork"},{"type":"user_nicename","value":"mcsherry","user_id":32863,"rest_url":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/microsoft-research\/v1\/researchers?person=mcsherry"},{"type":"text","value":"Kobbi Nissim","user_id":0,"rest_url":false},{"type":"text","value":"Adam Smith","user_id":0,"rest_url":false}],"msr_impact_theme":[],"msr_research_lab":[199565],"msr_event":[],"msr_group":[],"msr_project":[169518],"publication":[],"video":[],"download":[],"msr_publication_type":"inproceedings","related_content":{"projects":[{"ID":169518,"post_title":"Database Privacy","post_name":"database-privacy","post_type":"msr-project","post_date":"2003-11-24 13:44:35","post_modified":"2020-03-12 16:39:21","post_status":"publish","permalink":"https:\/\/www.microsoft.com\/en-us\/research\/project\/database-privacy\/","post_excerpt":"Overview The problem of statistical disclosure control\u2014revealing accurate statistics about a population while preserving the privacy of 
individuals\u2014has a venerable history. An extensive literature spans multiple disciplines: statistics, theoretical computer science, security, and databases.\u00a0 Nevertheless, despite this extensive literature, \u00abprivacy breaches\u00bb are common, both in the literature and in practice, even when security and data integrity are not compromised. This project revisits private data analysis from the perspective of modern cryptography.\u00a0 We address many previous…","_links":{"self":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-project\/169518"}]}}]},"_links":{"self":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-research-item\/145478"}],"collection":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-research-item"}],"about":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/types\/msr-research-item"}],"version-history":[{"count":3,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-research-item\/145478\/revisions"}],"predecessor-version":[{"id":447171,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-research-item\/145478\/revisions\/447171"}],"wp:attachment":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media?parent=145478"}],"wp:term":[{"taxonomy":"msr-content-type","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-content-type?post=145478"},{"taxonomy":"msr-research-highlight","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-research-highlight?post=145478"},{"taxonomy":"msr-research-area","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/research-area?post=145478"},{"taxonomy":"msr-publication-type","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-publication-type?post=145478"},{"taxonomy":"msr-product-type
","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-product-type?post=145478"},{"taxonomy":"msr-focus-area","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-focus-area?post=145478"},{"taxonomy":"msr-platform","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-platform?post=145478"},{"taxonomy":"msr-download-source","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-download-source?post=145478"},{"taxonomy":"msr-locale","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-locale?post=145478"},{"taxonomy":"msr-post-option","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-post-option?post=145478"},{"taxonomy":"msr-field-of-study","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-field-of-study?post=145478"},{"taxonomy":"msr-conference","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-conference?post=145478"},{"taxonomy":"msr-journal","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-journal?post=145478"},{"taxonomy":"msr-impact-theme","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-impact-theme?post=145478"},{"taxonomy":"msr-pillar","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-pillar?post=145478"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}