Paper 2017/695

Updatable Tokenization: Formal Definitions and Provably Secure Constructions

Christian Cachin, Jan Camenisch, Eduarda Freire-Stoegbuchner, and Anja Lehmann


Tokenization is the process of consistently replacing sensitive elements, such as credit card numbers, with non-sensitive surrogate values. As tokenization is mandated for any organization storing credit card data, many practical solutions have been introduced and are in commercial operation today. However, all existing solutions are still static, i.e., they do not allow for efficient updates of the cryptographic keys while maintaining the consistency of the tokens. This lack of updatability is a burden for most practical deployments, as cryptographic keys must also be re-keyed periodically to ensure continued security. This paper introduces a model for updatable tokenization with key evolution, in which a key exposure does not disclose relations among tokenized data in the past, and where the updates to the tokenized data set can be made by an untrusted entity and preserve the consistency of the data. We formally define the desired security properties guaranteeing unlinkability of tokens among different time epochs and one-wayness of the tokenization process. Moreover, we construct two highly efficient updatable tokenization schemes and prove them to achieve our security notions.
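The core idea of updatable tokenization can be sketched with a hash-then-exponentiate construction: a token is the hash of the sensitive value raised to an epoch key, and key rotation hands the untrusted data host only a multiplicative "tweak" that moves stored tokens to the new epoch without revealing the plaintext. The following is a minimal illustrative sketch, not the paper's exact constructions; the group parameters and key values below are toy assumptions chosen only so the example runs.

```python
import hashlib

# Toy parameters: a tiny safe prime p = 2q + 1 (illustration only --
# a real deployment would use a group of ~256-bit prime order).
p, q = 1019, 509

def hash_to_group(value: str) -> int:
    """Hash a sensitive value into the order-q subgroup of Z_p^* (by squaring)."""
    h = int.from_bytes(hashlib.sha256(value.encode()).digest(), "big")
    return pow(h % p, 2, p)

def tokenize(value: str, key: int) -> int:
    """Epoch token for `value` under epoch key `key`: H(value)^key mod p."""
    return pow(hash_to_group(value), key, p)

def update_tweak(old_key: int, new_key: int) -> int:
    """Tweak handed to the (untrusted) data host: new_key / old_key mod q."""
    return new_key * pow(old_key, -1, q) % q

def update_token(token: int, tweak: int) -> int:
    """Host re-keys a stored token without ever seeing the plaintext."""
    return pow(token, tweak, p)

# Epoch 1: tokenize a credit card number under epoch key k1.
k1, k2 = 123, 456  # epoch keys (random in [1, q-1] in practice)
t1 = tokenize("4111-1111-1111-1111", k1)

# Key rotation: the host updates the stored token using only the tweak.
delta = update_tweak(k1, k2)
t2 = update_token(t1, delta)

# Consistency: the updated token equals a fresh epoch-2 token.
assert t2 == tokenize("4111-1111-1111-1111", k2)
```

Consistency holds because the exponents compose: (H(x)^{k1})^{k2/k1} = H(x)^{k2} in the order-q subgroup, while tokens from different epochs look unrelated without the tweak.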

Category: Cryptographic protocols
Publication info: Published elsewhere. MAJOR revision. Financial Cryptography and Data Security 2017
Contact author(s): anj @ zurich ibm com
History: 2017-07-21: received
License: Creative Commons Attribution


@misc{cryptoeprint:2017/695,
      author = {Christian Cachin and Jan Camenisch and Eduarda Freire-Stoegbuchner and Anja Lehmann},
      title = {Updatable Tokenization: Formal Definitions and Provably Secure Constructions},
      howpublished = {Cryptology ePrint Archive, Paper 2017/695},
      year = {2017},
      note = {\url{https://eprint.iacr.org/2017/695}},
      url = {https://eprint.iacr.org/2017/695}
}