Paper 2021/432

XORBoost: Tree Boosting in the Multiparty Computation Setting

Kevin Deforth, Marc Desgroseilliers, Nicolas Gama, Mariya Georgieva, Dimitar Jetchev, and Marius Vuille

Abstract

We present XORBoost, a novel protocol for both training gradient boosted tree models and using these models for inference in the multiparty computation (MPC) setting. Similarly to [AEV20], our protocol supports training on generically split datasets (vertical splitting, horizontal splitting, or a combination of the two) while keeping all information about the features and thresholds associated with the nodes private; thus, only the depths and the number of binary trees are public parameters of the model. By using optimization techniques that reduce the number of oblivious permutation evaluations, as well as the quicksort and real-number arithmetic algorithms from the recent Manticore MPC framework [CDG+21], we obtain a scalable implementation operating under an information-theoretic security model in the honest-but-curious setting with a trusted dealer. On a training dataset of 25,000 samples and 300 features in the 2-player setting, we are able to train 10 regression trees of depth 4 in less than 1.5 minutes per tree (using histograms of 128 bins).

Metadata
Available format(s)
PDF
Category
Implementation
Publication info
Preprint. MINOR revision.
Keywords
Multiparty Computation Protocols, Efficient implementation, Applied cryptography
Contact author(s)
mariya @ inpher io
History
2021-12-03: last of 2 revisions
2021-04-06: received
Short URL
https://ia.cr/2021/432
License
Creative Commons Attribution
CC BY

BibTeX

@misc{cryptoeprint:2021/432,
      author = {Kevin Deforth and Marc Desgroseilliers and Nicolas Gama and Mariya Georgieva and Dimitar Jetchev and Marius Vuille},
      title = {{XORBoost}: Tree Boosting in the Multiparty Computation Setting},
      howpublished = {Cryptology {ePrint} Archive, Paper 2021/432},
      year = {2021},
      url = {https://eprint.iacr.org/2021/432}
}