Cryptology ePrint Archive: Report 2021/755

Tetrad: Actively Secure 4PC for Secure Training and Inference

Nishat Koti and Arpita Patra and Rahul Rachuri and Ajith Suresh

Abstract: Mixing arithmetic and boolean circuits to perform privacy-preserving machine learning has become increasingly popular. Towards this, we propose Tetrad, a mixed-protocol framework for the four-party setting that tolerates at most one active corruption.
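As a clear-text illustration of why mixing the two circuit types helps (this is not Tetrad's protocol, just the underlying computational view), consider ReLU over k-bit two's-complement values: extracting the sign bit is a cheap boolean-domain operation, while the gating multiplication is a cheap arithmetic-domain operation.

```python
# Clear-text sketch of the mixed arithmetic/boolean view.
# Assumption: values are k-bit two's-complement fixed-point elements
# of the ring Z_{2^K}, a common choice in such frameworks.
K = 32
MOD = 1 << K

def to_ring(v):
    """Embed a signed integer into the ring Z_{2^K}."""
    return v % MOD

def msb(x):
    """Boolean-domain operation: the sign bit of a ring element."""
    return (x >> (K - 1)) & 1

def relu(x):
    """Mixed computation: a boolean sign bit gates an arithmetic product."""
    return (1 - msb(x)) * x % MOD

assert relu(to_ring(5)) == 5    # positive input passes through
assert relu(to_ring(-3)) == 0   # negative input is zeroed
```

In an MPC framework, `msb` would run on boolean shares and the product on arithmetic shares, which is exactly what makes conversions between the two domains worth optimizing.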

Tetrad works over rings and supports two levels of security: fairness and robustness. The fair multiplication protocol requires communicating only 5 ring elements, improving over the state-of-the-art Trident (Chaudhari et al., NDSS'20). A key feature of Tetrad is that robustness comes for free over the fair protocols. Other highlights across the two variants include (a) probabilistic truncation without overhead, (b) multi-input multiplication protocols, and (c) conversion protocols for switching between the computational domains, along with a tailor-made garbled-circuit approach.

Tetrad is benchmarked for both training and inference on deep neural networks such as LeNet and VGG16. Compared to Trident, Tetrad is up to 4 times faster in ML training and up to 5 times faster in ML inference. Tetrad is also lightweight in terms of deployment cost, costing up to 6 times less than Trident.

Category / Keywords: cryptographic protocols / 4PC, fair, robust, multi-party computation, privacy preserving machine learning

Original Publication (with major differences): Network and Distributed System Security Symposium (NDSS) 2022
DOI: 10.14722/ndss.2022.24058

Date: received 5 Jun 2021, last revised 3 Jan 2022

Contact author: kotis at iisc ac in, arpita at iisc ac in, rachuri at cs au dk, ajith at iisc ac in

Available format(s): PDF

Note: This article is the full and extended version of an article to appear in the Network and Distributed System Security Symposium (NDSS) 2022.

Version: 20220103:170953

Short URL: ia.cr/2021/755
