Paper 2024/535
NodeGuard: A Highly Efficient Two-Party Computation Framework for Training Large-Scale Gradient Boosting Decision Tree
Abstract
The Gradient Boosting Decision Tree (GBDT) is a well-known machine learning algorithm that achieves high performance and outstanding interpretability in real-world scenarios such as fraud detection, online marketing, and risk management. Two data owners can jointly train a GBDT model without disclosing their private datasets by executing secure Multi-Party Computation (MPC) protocols. In this work, we propose NodeGuard, a highly efficient two-party computation (2PC) framework for large-scale GBDT training and inference. NodeGuard guarantees that no sensitive intermediate results are leaked during training or inference. Its efficiency advantage comes from a novel keyed bucket aggregation protocol, which globally optimizes the communication and computation complexity of training. Additionally, we introduce a probabilistic approximate division protocol with an optimization for re-scaling when the divisor is publicly known. Finally, we compare NodeGuard to state-of-the-art frameworks and show that it is extremely efficient: it improves privacy-preserving GBDT training performance by a factor of 5.0 to 131 in the LAN setting and 2.7 to 457 in the WAN setting.
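To give a flavor of the public-divisor re-scaling idea mentioned in the abstract, the sketch below shows the standard technique it builds on: approximate division of an additively secret-shared fixed-point value by a publicly known divisor, performed locally by each party (SecureML-style share truncation, generalized from powers of two to an arbitrary public divisor). This is a minimal Python sketch under assumed parameters (a 64-bit ring, 16 fractional bits); the names and constants are illustrative and not taken from the paper, and NodeGuard's actual protocol may differ.

```python
import random

RING = 1 << 64   # additive secret sharing over the ring Z_{2^64} (assumed parameter)
FRAC = 16        # fractional bits of the fixed-point encoding (assumed parameter)

def share(x):
    """Split an integer into two additive shares modulo 2^64."""
    r = random.randrange(RING)
    return r, (x - r) % RING

def reveal(x0, x1):
    """Recombine the shares and map back to a signed integer."""
    v = (x0 + x1) % RING
    return v - RING if v >= RING // 2 else v

def public_div(x0, x1, d):
    """Approximate division by a PUBLIC divisor d with no communication:
    each party truncates its own share locally. The reconstructed result
    is off by at most one ring unit, except with negligible probability
    when the shared value is much smaller than the ring."""
    y0 = x0 // d                            # party 0 truncates its share
    y1 = (RING - (RING - x1) // d) % RING   # party 1 truncates the negated share
    return y0, y1

# Example: compute 12.5 / 4 on secret shares (values are illustrative).
x = int(12.5 * (1 << FRAC))
x0, x1 = share(x)
y0, y1 = public_div(x0, x1, 4)
print(reveal(y0, y1) / (1 << FRAC))  # ~3.125, up to a 2^-FRAC error
```

Because each party only transforms its own share, the operation costs no communication; the price is a small probabilistic rounding error, which is why protocols of this kind are called probabilistic approximate division.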
Metadata
- Available format(s)
- PDF
- Category
- Cryptographic protocols
- Publication info
- Published elsewhere. DLSP 2024
- Keywords
- MPC, Two-party computation, Gradient boosting decision tree, Secure bucket aggregation
- Contact author(s)
- yufan jiang @ partner kit edu
- yong li1 @ huawei com
- History
- 2024-04-06: approved
- 2024-04-05: received
- Short URL
- https://ia.cr/2024/535
- License
- CC BY
BibTeX
@misc{cryptoeprint:2024/535,
      author = {Tianxiang Dai and Yufan Jiang and Yong Li and Fei Mei},
      title = {{NodeGuard}: A Highly Efficient Two-Party Computation Framework for Training Large-Scale Gradient Boosting Decision Tree},
      howpublished = {Cryptology {ePrint} Archive, Paper 2024/535},
      year = {2024},
      url = {https://eprint.iacr.org/2024/535}
}