Paper 2021/594
ZK Contingent Payments for Trained Neural Networks
Zhelei Zhou and Xinlei Cao and Jian Liu and Bingsheng Zhang and Kui Ren
Abstract
Nowadays, neural networks are widely used in many machine learning tasks. In practice, one might not have the expertise to fine-tune a neural network model; it has therefore become increasingly popular to outsource model training to a machine learning expert. This creates a need for fair model exchange: if the seller sends the model first, the buyer might refuse to pay; if the buyer pays first, the seller might refuse to send the model, or might send an inferior one. In this work, we aim to address this problem so that neither the buyer nor the seller can deceive the other. We start from Zero Knowledge Contingent Payment (ZKCP), which enables fair exchange of digital goods for payment over a blockchain, and extend it to Zero Knowledge Contingent Model Payment (ZKCMP). We then instantiate our ZKCMP with two state-of-the-art NIZK proof systems: zk-SNARKs and Libra. We also propose a random sampling technique to improve the efficiency of zk-SNARKs. We conduct extensive experiments to demonstrate the practicality of our proposal.
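The ZKCP pattern the abstract builds on can be sketched as follows: the seller encrypts the digital good under a key, publishes the key's hash together with a ZK proof that the ciphertext decrypts to a valid good, and the buyer locks payment to that hash; revealing the key on-chain simultaneously claims the payment and lets the buyer decrypt. The snippet below is a minimal local simulation of that flow, not the paper's protocol: the toy XOR stream cipher, the placeholder model bytes, and the `hashlock_claim` stand-in for an on-chain script are all illustrative assumptions, and the ZK proof step is only indicated in a comment.

```python
import hashlib
import secrets

def keystream_encrypt(key: bytes, data: bytes) -> bytes:
    # Toy stream cipher: XOR with a SHA-256-based keystream (illustrative only;
    # a real instantiation would use an authenticated cipher inside the circuit).
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(b ^ s for b, s in zip(data, stream))

# --- Seller: encrypt the good and commit to the key ---
model_bytes = b"trained model weights (placeholder)"
key = secrets.token_bytes(32)
ciphertext = keystream_encrypt(key, model_bytes)
key_hash = hashlib.sha256(key).hexdigest()
# In ZKCP, the seller additionally sends a NIZK proof that `ciphertext`
# decrypts, under the preimage of `key_hash`, to a good with the agreed property.

# --- Buyer: lock payment to the key hash ---
def hashlock_claim(revealed_key: bytes) -> bool:
    # Stand-in for an on-chain hash-lock script: pay out iff the
    # revealed preimage hashes to the committed value.
    return hashlib.sha256(revealed_key).hexdigest() == key_hash

# --- Settlement: revealing the key claims payment and unlocks the good ---
assert hashlock_claim(key)
recovered = keystream_encrypt(key, ciphertext)  # XOR cipher: decryption = encryption
assert recovered == model_bytes
```

The hash-lock makes the exchange atomic: the only way for the seller to get paid is to publish the key, and once published the buyer can decrypt; the ZK proof (elided here) is what prevents the seller from encrypting garbage.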
Metadata
- Publication info
- Preprint. MINOR revision.
- Keywords
- zero knowledge
- Contact author(s)
- zl_zhou@zju.edu.cn, xinle@zju.edu.cn, bingsheng@zju.edu.cn, liujian2411@zju.edu.cn
- History
- 2022-01-07: last of 2 revisions
- 2021-05-10: received
- Short URL
- https://ia.cr/2021/594
- License
- CC BY