Paper 2020/1130

Secure training of decision trees with continuous attributes

Mark Abspoel, Daniel Escudero, and Nikolaj Volgushev


We apply multiparty computation (MPC) techniques to show, given a database that is secret-shared among multiple mutually distrustful parties, how the parties may obliviously construct a decision tree based on the secret data. We consider data with continuous attributes (i.e., coming from a large domain), and develop a secure version of a learning algorithm similar to the C4.5 or CART algorithms. Previous MPC-based work focused only on decision tree learning with discrete attributes (De Hoogh et al. 2014). Our starting point is to apply an existing generic MPC protocol to a standard decision tree learning algorithm, which we then optimize in several ways. We exploit the fact that even if we allow the data to have continuous values, which a priori might require fixed- or floating-point representations, the output of the tree learning algorithm depends only on the relative ordering of the data. By obliviously sorting the data we reduce the number of comparisons needed per node from the naive $O(N^2)$ to $O(N \log^2 N)$, where $N$ is the number of training records in the dataset, thus making the algorithm feasible for larger datasets. However, this introduces a problem when duplicate values occur in the dataset, which we overcome with a relatively cheap subprotocol. We show a procedure to convert a sorting network into a permutation network of smaller complexity, resulting in a round complexity of $O(\log N)$ per layer in the tree. We implement our algorithm in the MP-SPDZ framework and benchmark our implementation for both passive and active three-party computation using arithmetic modulo $2^{64}$. We apply our implementation to a large-scale medical dataset of $\approx 290\,000$ rows using random forests, and thus demonstrate the practical feasibility of using MPC for privacy-preserving machine learning based on decision trees for large datasets.
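The key observation above, that split selection depends only on the relative ordering of attribute values, can be illustrated in the clear. The sketch below (plain Python, not the MPC protocol, and not code from the paper) finds the best Gini-impurity threshold for one continuous attribute: after an $O(N \log N)$ sort, all candidate thresholds are evaluated in a single linear scan, whereas the naive approach compares every record against every candidate threshold. Duplicate values are handled by never placing a threshold between equal values, a plaintext analogue of the duplicate problem the paper's subprotocol addresses obliviously.

```python
def best_split(values, labels):
    """Best binary split on one continuous attribute for 0/1 labels.

    Returns (threshold, weighted_gini). Plaintext illustration only:
    the paper performs the analogous scan obliviously over sorted,
    secret-shared data.
    """
    n = len(values)
    order = sorted(range(n), key=lambda i: values[i])  # O(N log N), done once
    total_pos = sum(labels)

    def gini(pos, size):
        # Gini impurity of a node with `pos` positives out of `size` records.
        p = pos / size
        return 2 * p * (1 - p)

    best = (None, float("inf"))
    left_pos = 0
    for k in range(1, n):  # first k records (in sorted order) go left
        left_pos += labels[order[k - 1]]
        # A threshold cannot separate equal values: skip duplicates.
        if values[order[k - 1]] == values[order[k]]:
            continue
        right_pos = total_pos - left_pos
        score = (k / n) * gini(left_pos, k) \
            + ((n - k) / n) * gini(right_pos, n - k)
        if score < best[1]:
            threshold = (values[order[k - 1]] + values[order[k]]) / 2
            best = (threshold, score)
    return best
```

For example, `best_split([1.0, 2.0, 3.0, 4.0], [0, 0, 1, 1])` returns `(2.5, 0.0)`: splitting at 2.5 yields two pure children. The scan visits each record once, which is what makes per-node work dominated by the sort rather than by pairwise comparisons.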

Note: Fixed acknowledgements

Category
Cryptographic protocols
Publication info
Published elsewhere. Proceedings on Privacy Enhancing Technologies (PoPETs) 2021, Issue 1
Keywords
decision trees, multiparty computation, privacy-preserving machine learning
Contact author(s)
abspoel@cwi.nl
escudero@cs.au.dk
2021-06-20: revised
2020-09-21: received
License
Creative Commons Attribution


@misc{cryptoeprint:2020/1130,
      author = {Mark Abspoel and Daniel Escudero and Nikolaj Volgushev},
      title = {Secure training of decision trees with continuous attributes},
      howpublished = {Cryptology ePrint Archive, Paper 2020/1130},
      year = {2020},
      note = {\url{https://eprint.iacr.org/2020/1130}},
      url = {https://eprint.iacr.org/2020/1130}
}