Paper 2024/1077

Securely Training Decision Trees Efficiently

Divyanshu Bhardwaj, Microsoft Research (India)
Sandhya Saravanan, Microsoft Research (India)
Nishanth Chandran, Microsoft Research (India)
Divya Gupta, Microsoft Research (India)
Abstract

Decision trees are an important class of supervised learning algorithms. When multiple entities contribute data to train a decision tree (e.g., for fraud detection in the financial sector), data privacy concerns necessitate the use of a privacy-enhancing technology such as secure multi-party computation (MPC) to protect the underlying training data. The prior state of the art (Hamada et al.) constructs an MPC protocol for decision tree training with communication complexity $\mathcal{O}(hmN\log N)$ when building a decision tree of height $h$ on a training dataset of $N$ samples, each with $m$ attributes. In this work, we significantly reduce the communication complexity of secure decision tree training. We construct a protocol with communication complexity $\mathcal{O}(mN\log N + hmN + hN\log N)$, thereby achieving an improvement of $\approx \mathsf{min}(h, m, \log N)$ over Hamada et al. At the core of our technique is an improved protocol that regroups sorted private elements into finer groups (according to a flag vector) while maintaining their relative ordering. We implement our protocol in the MP-SPDZ framework and show that it requires $10\times$ less communication and is $9\times$ faster than the state of the art.
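
Illustration (not from the paper): the regrouping primitive mentioned above can be understood through its plaintext analogue, sketched below in Python. The sketch assumes elements arrive already ordered by their current group and splits each group into two child groups according to a 0/1 flag vector, preserving relative order on each side; all function and variable names are illustrative, and the actual protocol carries out this step obliviously under MPC on secret-shared data.

def regroup(values, group_ids, flags):
    # Plaintext analogue of the regrouping step: within each existing group,
    # place elements with flag 0 before elements with flag 1 (splitting the
    # group into two child groups) while preserving relative order on each side.
    new_values, new_groups = [], []
    for g in dict.fromkeys(group_ids):            # groups, in order of first appearance
        idx = [i for i, gid in enumerate(group_ids) if gid == g]
        for flag in (0, 1):                       # left child first, then right child
            for i in idx:
                if flags[i] == flag:
                    new_values.append(values[i])
                    new_groups.append((g, flag))
    return new_values, new_groups

# Example: two groups A = [3, 7, 9] and B = [2, 5, 8], each already sorted.
values    = [3, 7, 9, 2, 5, 8]
group_ids = ["A", "A", "A", "B", "B", "B"]
flags     = [1, 0, 1, 0, 1, 0]                    # 0 = left child, 1 = right child
print(regroup(values, group_ids, flags))
# -> ([7, 3, 9, 2, 8, 5],
#     [('A', 0), ('A', 1), ('A', 1), ('B', 0), ('B', 0), ('B', 1)])

In decision tree training, the groups correspond to the nodes at the current level and the flags to the split decisions, so applying this regrouping once per level avoids re-sorting the dataset at every height, which is where the stated communication savings come from.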

Metadata
Available format(s)
PDF
Category
Cryptographic protocols
Publication info
Published elsewhere. Minor revision. ACM SIGSAC Conference on Computer and Communications Security 2024
DOI
10.1145/3658644.3670268
Keywords
Privacy Preserving Machine Learning, Decision Tree Training
Contact author(s)
dbhardwaj @ ucsb edu
t-ssaravanan @ microsoft com
nichandr @ microsoft com
divya gupta @ microsoft com
History
2024-07-02: revised
2024-07-02: received
Short URL
https://ia.cr/2024/1077
License
Creative Commons Attribution-NonCommercial
CC BY-NC

BibTeX

@misc{cryptoeprint:2024/1077,
      author = {Divyanshu Bhardwaj and Sandhya Saravanan and Nishanth Chandran and Divya Gupta},
      title = {Securely Training Decision Trees Efficiently},
      howpublished = {Cryptology ePrint Archive, Paper 2024/1077},
      year = {2024},
      doi = {10.1145/3658644.3670268},
      note = {\url{https://eprint.iacr.org/2024/1077}},
      url = {https://eprint.iacr.org/2024/1077}
}