Cryptology ePrint Archive: Report 2017/658

Privacy for Targeted Advertising

Avradip Mandal and John Mitchell and Hart Montgomery and Arnab Roy

Abstract: In the past two decades, targeted online advertising has led to massive data collection, aggregation, and exchange. This infrastructure raises significant privacy concerns. While several prominent theories of data privacy have been proposed over the same period, these notions have limited application to advertising ecosystems. Differential privacy, the most robust of them, is inherently inapplicable to queries about particular individuals in the dataset. We therefore formulate a new definition of privacy for accessing private information about unknown individuals identified by some random token. Unlike most current privacy definitions, ours takes probabilistic prior information into account and is intended to reflect the use of aggregated web information for targeted advertising.
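For context, the standard notion of differential privacy referenced above is defined over pairs of neighboring datasets; the following is the textbook formulation (due to Dwork et al.), which makes clear why it constrains only aggregate releases rather than queries about a specific, token-identified individual:

```latex
% A randomized mechanism M is \epsilon-differentially private if, for all
% datasets D, D' differing in a single record, and all measurable sets
% S \subseteq \mathrm{Range}(M):
\Pr[M(D) \in S] \;\le\; e^{\epsilon} \cdot \Pr[M(D') \in S]
```

Because the guarantee is symmetric over the presence or absence of any one record, it bounds what an adversary learns from aggregate statistics but says nothing meaningful about a query whose very purpose is to retrieve information tied to one individual.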

We explain how our theory captures the natural expectation of privacy in the advertising setting and avoids the limitations of existing alternatives. However, although we can construct artificial databases that satisfy our notion of privacy while retaining reasonable utility, we have no evidence that real-world databases can be sanitized to preserve reasonable utility. In fact, we offer real-world evidence that adherence to our notion of privacy almost completely destroys utility. Our results suggest that a significant theoretical advance or a change in infrastructure is needed to obtain rigorous privacy guarantees in the digital advertising ecosystem.

Category / Keywords: foundations / Privacy, Utility, Data sharing, Targeted advertisements

Date: received 3 Jul 2017, last revised 20 Jul 2017

Contact author: arnabr at gmail com

Available format(s): PDF | BibTeX Citation

Note: Added more references.

Version: 20170720:191539 (All versions of this report)
