Cryptology ePrint Archive: Report 2021/594

Zero Knowledge Contingent Payments for Trained Neural Networks

Zhelei Zhou and Xinlei Cao and Jian Liu and Bingsheng Zhang and Kui Ren

Abstract: Neural networks are now widely used in many machine learning tasks. In practice, one may lack the expertise to fine-tune a neural network model, so it has become increasingly popular to outsource model training to a machine learning expert. This creates a need for fair model exchange: if the seller sends the model first, the buyer might refuse to pay; if the buyer pays first, the seller might refuse to send the model, or might send an inferior one. In this work, we address this problem so that neither the buyer nor the seller can cheat the other. We start from Zero Knowledge Contingent Payment (ZKCP), which enables fair exchange of digital goods for payment over a blockchain, and extend it to Zero Knowledge Contingent Model Payment (ZKCMP). We then instantiate ZKCMP with two state-of-the-art NIZK proof systems: zk-SNARKs and Libra. We also propose a random sampling technique to improve the efficiency of zk-SNARKs. We conduct extensive experiments to demonstrate the practicality of our proposal.
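To make the ZKCP fair-exchange pattern mentioned in the abstract concrete, the following is a minimal, hypothetical sketch of the hash-lock mechanism it builds on: the seller encrypts the digital good under a key k, publishes the ciphertext together with H(k) (and, in a real ZKCP, a zero-knowledge proof that the ciphertext decrypts to a valid good — the proof step, which is the paper's actual subject, is omitted here); the buyer then pays via a transaction redeemable only by revealing a preimage of H(k), so revealing k simultaneously claims the payment and lets the buyer decrypt. All names and the toy XOR cipher below are illustrative assumptions, not the paper's construction.

```python
import hashlib

def keystream(key: bytes, n: int) -> bytes:
    # Derive n pseudorandom bytes from key via counter-mode SHA-256
    # (toy stream cipher for illustration only).
    out = b""
    ctr = 0
    while len(out) < n:
        out += hashlib.sha256(key + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return out[:n]

def xor_encrypt(key: bytes, data: bytes) -> bytes:
    # XOR stream cipher: the same function both encrypts and decrypts.
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

# --- Seller side (hypothetical values) ---
model = b"serialized model weights"          # the digital good
k = b"fresh session key 0123456789abcdef"   # decryption key
ciphertext = xor_encrypt(k, model)
hash_lock = hashlib.sha256(k).hexdigest()
# Seller sends (ciphertext, hash_lock), plus — in a real ZKCP — a ZK
# proof that decrypting ciphertext under the preimage of hash_lock
# yields a valid model.

# --- Buyer side ---
# Buyer posts a payment spendable by anyone revealing a preimage of
# hash_lock; the seller reveals k on-chain to claim it.
revealed = k
assert hashlib.sha256(revealed).hexdigest() == hash_lock
recovered = xor_encrypt(revealed, ciphertext)
assert recovered == model
```

The key point the paper addresses is the omitted proof: without it, the seller could hash-lock a key that decrypts to garbage or to an inferior model, which is exactly the cheating ZKCMP is designed to rule out.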

Category / Keywords: zero knowledge

Original Publication (with minor differences): Springer

Date: received 6 May 2021, last revised 7 Jan 2022

Contact author: zl_zhou at zju edu cn

Available format(s): PDF | BibTeX Citation

Version: 20220107:113525
