Paper 2021/594

Zero Knowledge Contingent Payments for Trained Neural Networks

Zhelei Zhou, Xinlei Cao, Jian Liu, Bingsheng Zhang, and Kui Ren


Nowadays, neural networks are widely used in many machine learning tasks. In practice, one might not have enough expertise to fine-tune a neural network model; therefore, it has become increasingly popular to outsource the model training process to a machine learning expert. This creates a need for fair model exchange: if the seller sends the model first, the buyer might refuse to pay; if the buyer pays first, the seller might refuse to send the model, or might send an inferior one. In this work, we aim to address this problem so that neither the buyer nor the seller can deceive the other. We start from Zero Knowledge Contingent Payment (ZKCP), which is used for the fair exchange of digital goods and payment over a blockchain, and extend it to Zero Knowledge Contingent Model Payment (ZKCMP). We then instantiate our ZKCMP with two state-of-the-art NIZK proof systems: zk-SNARKs and Libra. We also propose a random sampling technique to improve the efficiency of zk-SNARKs. We conduct extensive experiments to demonstrate the practicality of our proposal.
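The fair-exchange idea underlying ZKCP can be illustrated with its classic hashlock construction: the seller encrypts the good under a one-time key k, proves in zero knowledge that the ciphertext decrypts to a valid good and that a published hash h equals SHA256(k), and the buyer locks payment to h on-chain so that revealing k simultaneously releases the payment and the good. The following is a minimal sketch of that mechanism only; the toy XOR cipher and the omitted NIZK proof are illustrative assumptions, not the paper's actual instantiation (which uses zk-SNARKs or Libra).

```python
import hashlib
import os

def xor_encrypt(key: bytes, data: bytes) -> bytes:
    """Toy symmetric stream cipher for illustration only (not secure)."""
    stream = hashlib.sha256(key).digest()
    while len(stream) < len(data):
        stream += hashlib.sha256(stream).digest()
    return bytes(a ^ b for a, b in zip(data, stream))

# --- Seller side ---
model = b"serialized neural network weights"   # the digital good being sold
k = os.urandom(32)                             # one-time decryption key
ciphertext = xor_encrypt(k, model)             # sent to the buyer off-chain
h = hashlib.sha256(k).hexdigest()              # hashlock value for the payment

# In a real ZKCP, the seller also sends a NIZK proof that (a) the ciphertext
# decrypts under some k to a valid good (here: a correctly trained model) and
# (b) h = SHA256(k) -- without revealing k. That proof is omitted in this sketch.

# --- Buyer side: lock payment to h; the chain pays whoever reveals k ---
# --- Seller claims payment by revealing k; everyone can verify ---
assert hashlib.sha256(k).hexdigest() == h      # on-chain hashlock check passes
recovered = xor_encrypt(k, ciphertext)         # buyer decrypts with revealed k
assert recovered == model                      # buyer obtains the good
```

Because revealing k is the only way for the seller to claim the locked funds, and k is exactly what the buyer needs to decrypt, neither party can walk away with both the payment and the good.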

Publication info
Published elsewhere. Minor revision. Springer
Keywords: zero knowledge
Contact author(s)
zl_zhou @ zju edu cn
2022-01-07: last of 2 revisions
2021-05-10: received
License: Creative Commons Attribution


@misc{cryptoeprint:2021/594,
      author = {Zhelei Zhou and Xinlei Cao and Jian Liu and Bingsheng Zhang and Kui Ren},
      title = {Zero Knowledge Contingent Payments for Trained Neural Networks},
      howpublished = {Cryptology ePrint Archive, Paper 2021/594},
      year = {2021},
      doi = {10.1007/978-3-030-88428-4_31},
      note = {\url{}},
      url = {}
}