We explain how our theory captures the natural expectation of privacy in the advertising setting and avoids the limitations of existing alternatives. However, although we can construct artificial databases that satisfy our notion of privacy while retaining reasonable utility, we have no evidence that real-world databases can be sanitized in a way that preserves reasonable utility. In fact, we offer real-world evidence that adherence to our notion of privacy almost completely destroys utility. Our results suggest that a significant theoretical advance, or a change in infrastructure, is needed to obtain rigorous privacy guarantees in the digital advertising ecosystem.
Category / Keywords: foundations / privacy, utility, data sharing, targeted advertisements
Date: received 3 Jul 2017, last revised 20 Jul 2017
Contact author: arnabr at gmail com
Short URL: ia.cr/2017/658