Paper 2024/535

NodeGuard: A Highly Efficient Two-Party Computation Framework for Training Large-Scale Gradient Boosting Decision Tree

Tianxiang Dai, Huawei Technologies (Germany)
Yufan Jiang, Huawei Technologies (Germany), Karlsruhe Institute of Technology
Yong Li, Huawei Technologies (Germany)
Fei Mei, Huawei Technologies (Germany)

The Gradient Boosting Decision Tree (GBDT) is a well-known machine learning algorithm that achieves high performance and outstanding interpretability in real-world scenarios such as fraud detection, online marketing and risk management. Meanwhile, two data owners can jointly train a GBDT model without disclosing their private datasets by executing secure Multi-Party Computation (MPC) protocols. In this work, we propose NodeGuard, a highly efficient two-party computation (2PC) framework for large-scale GBDT training and inference. NodeGuard guarantees that no sensitive intermediate results are leaked during training or inference. The efficiency advantage of NodeGuard is achieved by applying a novel keyed bucket aggregation protocol, which globally optimizes the communication and computation complexity of training. Additionally, we introduce a probabilistic approximate division protocol with an optimization for re-scaling when the divisor is publicly known. Finally, we compare NodeGuard to state-of-the-art frameworks and show that NodeGuard is extremely efficient: it improves privacy-preserving GBDT training performance by a factor of 5.0 to 131 in the LAN setting and 2.7 to 457 in the WAN setting.
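The re-scaling optimization mentioned above exploits the fact that, over additive secret shares, division by a publicly known power-of-two divisor can be done locally with only a small probabilistic error; this is the probabilistic truncation idea in the style of SecureML. The sketch below illustrates that general idea only, not NodeGuard's actual protocol, and all names and parameters are illustrative.

```python
import random

L = 64          # ring size: arithmetic is over Z_{2^64}
F = 16          # fixed-point fractional bits (public divisor is 2^F)
MOD = 1 << L

def share(x):
    """Additively secret-share x in Z_{2^L} between two parties."""
    r = random.getrandbits(L)
    return r, (x - r) % MOD

def reconstruct(a0, a1):
    """Recombine the two additive shares."""
    return (a0 + a1) % MOD

def trunc_local(a0, a1, f=F):
    """Divide the shared value by the public divisor 2^f without any
    communication: each party transforms only its own share.  The
    reconstructed result is off by at most 1 in the last bit, with
    overwhelming probability when the shared value is much smaller
    than 2^L (probabilistic truncation, SecureML-style)."""
    t0 = a0 >> f                           # party 0: plain shift
    t1 = (MOD - ((MOD - a1) >> f)) % MOD   # party 1: shift of the complement
    return t0, t1

# Demo: encode 1883.5 in fixed point, then re-scale by the public 2^F.
x = int(1883.5 * (1 << F))
a0, a1 = share(x)
t0, t1 = trunc_local(a0, a1)
res = reconstruct(t0, t1)
assert abs(res - (x >> F)) <= 1   # exact up to one unit in the last place
```

The appeal of this pattern in GBDT training is that re-scalings after fixed-point multiplications become free of interaction; a general public divisor would additionally need a public multiplication by an approximate inverse before truncating.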

Category
Cryptographic protocols
Publication info
Published elsewhere. DLSP 2024
Keywords
MPC · Two-party computation · Gradient boosting decision tree · Secure bucket aggregation
Contact author(s)
yufan jiang @ partner kit edu
yong li1 @ huawei com
History
2024-04-06: approved
2024-04-05: received
License
Creative Commons Attribution

BibTeX

@misc{cryptoeprint:2024/535,
      author = {Tianxiang Dai and Yufan Jiang and Yong Li and Fei Mei},
      title = {{NodeGuard}: A Highly Efficient Two-Party Computation Framework for Training Large-Scale Gradient Boosting Decision Tree},
      howpublished = {Cryptology ePrint Archive, Paper 2024/535},
      year = {2024},
      note = {\url{}},
      url = {}
}