Paper 2025/535
zkPyTorch: A Hierarchical Optimized Compiler for Zero-Knowledge Machine Learning
Abstract
As artificial intelligence (AI) becomes increasingly embedded in high-stakes applications such as healthcare, finance, and autonomous systems, ensuring the verifiability of AI computations without compromising sensitive data or proprietary models is crucial. Zero-knowledge machine learning (ZKML) leverages zero-knowledge proofs (ZKPs) to enable the verification of AI model outputs while preserving confidentiality. However, existing ZKML approaches require specialized cryptographic expertise, making them inaccessible to traditional AI developers. In this paper, we introduce zkPyTorch, a compiler that seamlessly integrates ML frameworks like PyTorch with ZKP engines like Expander, simplifying the development of ZKML. zkPyTorch automates the translation of ML operations into optimized ZKP circuits through three key components. First, a ZKP preprocessor converts models into structured computational graphs and injects the auxiliary information needed to facilitate proof generation. Second, a ZKP-friendly quantization module introduces an optimized quantization strategy that reduces computation bit-widths, enabling efficient ZKP execution within smaller finite fields such as M61. Third, a hierarchical ZKP circuit optimizer employs a multi-level optimization framework at the model, operation, and circuit levels to improve proof generation efficiency. We demonstrate zkPyTorch's effectiveness through end-to-end case studies, successfully converting VGG-16 and Llama-3 models from PyTorch, a leading ML framework, into ZKP-compatible circuits recognizable by Expander, a state-of-the-art ZKP engine. Using Expander, we generate zero-knowledge proofs for these models, achieving proof generation in 2.2 seconds per CIFAR-10 image for VGG-16 and 150 seconds per token for Llama-3 inference, improving the practical adoption of ZKML.
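To make the abstract's notion of ZKP-friendly quantization concrete, the following minimal Python sketch shows one way real-valued model weights could be embedded as fixed-point elements of the Mersenne-61 field F_p with p = 2^61 - 1. The function names, the scale factor, and the signed encoding here are illustrative assumptions, not the paper's actual quantization scheme.

```python
# Illustrative sketch only (not zkPyTorch's implementation): fixed-point
# quantization of weights into the Mersenne-61 field F_p, p = 2^61 - 1.
# SCALE_BITS and all function names are hypothetical choices for this example.

P_M61 = (1 << 61) - 1      # Mersenne prime 2^61 - 1
SCALE_BITS = 16            # number of fractional fixed-point bits (assumed)
SCALE = 1 << SCALE_BITS


def quantize_to_field(x: float) -> int:
    """Map a real-valued weight to a field element in [0, p).

    Negative values are encoded as p - |v|, the usual representation of
    signed integers in a prime field.
    """
    v = round(x * SCALE)
    return v % P_M61


def dequantize_from_field(a: int) -> float:
    """Recover an approximate real value from its field encoding."""
    # Interpret elements above p/2 as negative.
    v = a if a <= P_M61 // 2 else a - P_M61
    return v / SCALE


if __name__ == "__main__":
    weights = [0.5, -1.25, 3.14159]
    encoded = [quantize_to_field(w) for w in weights]
    decoded = [dequantize_from_field(a) for a in encoded]
    print(encoded)   # field elements in [0, 2^61 - 1)
    print(decoded)   # approximately the original weights
```

The point of such an encoding is that additions and multiplications on the quantized values stay within a 61-bit field, which is the kind of small-field arithmetic the abstract credits for efficient proof generation.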
Metadata
- Available format(s): PDF
- Category: Applications
- Publication info: Preprint.
- Keywords: zkml, zero-knowledge proofs, implementation, compiler
- Contact author(s):
  tc @ polyhedra network
  tao @ polyhedra network
  zhiyong @ polyhedra network
  siqi @ polyhedra network
  zhenfei @ polyhedra network
  abner @ polyhedra network
  dawnsong @ gmail com
  jhzhang @ nus edu sg
- History:
  2025-03-23: approved
  2025-03-22: received
- Short URL: https://ia.cr/2025/535
- License: CC BY-SA
BibTeX
@misc{cryptoeprint:2025/535,
      author = {Tiancheng Xie and Tao Lu and Zhiyong Fang and Siqi Wang and Zhenfei Zhang and Yongzheng Jia and Dawn Song and Jiaheng Zhang},
      title = {{zkPyTorch}: A Hierarchical Optimized Compiler for Zero-Knowledge Machine Learning},
      howpublished = {Cryptology {ePrint} Archive, Paper 2025/535},
      year = {2025},
      url = {https://eprint.iacr.org/2025/535}
}