Paper 2015/386

Privately Evaluating Decision Trees and Random Forests

David J. Wu, Tony Feng, Michael Naehrig, and Kristin Lauter


Decision trees and random forests are common classifiers with widespread use. In this paper, we develop two protocols for privately evaluating decision trees and random forests. We operate in the standard two-party setting where the server holds a model (either a tree or a forest), and the client holds an input (a feature vector). At the conclusion of the protocol, the client learns only the model's output on its input and a few generic parameters concerning the model; the server learns nothing. The first protocol we develop provides security against semi-honest adversaries. We then give an extension of the semi-honest protocol that is robust against malicious adversaries. We implement both protocols and show that both variants are able to process trees with several hundred decision nodes in just a few seconds and a modest amount of bandwidth. Compared to previous semi-honest protocols for private decision tree evaluation, we demonstrate a tenfold improvement in computation and bandwidth.
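To make the functionality concrete, here is a minimal sketch of *plaintext* binary decision tree evaluation, the function the paper's protocols compute privately. This is purely illustrative (the `Node`/`Leaf` classes and the example tree are hypothetical, not from the paper): in the private protocol, the server never learns the feature vector `x` and the client never learns the tree's structure or thresholds.

```python
class Node:
    """Internal decision node: compare x[feature] against a threshold."""
    def __init__(self, feature, threshold, left, right):
        self.feature, self.threshold = feature, threshold
        self.left, self.right = left, right

class Leaf:
    """Leaf node holding the classification result."""
    def __init__(self, label):
        self.label = label

def evaluate(node, x):
    """Walk the tree: go left if x[feature] <= threshold, else right."""
    while isinstance(node, Node):
        node = node.left if x[node.feature] <= node.threshold else node.right
    return node.label

# Hypothetical tree with two decision nodes and three leaf labels.
tree = Node(0, 5.0,
            Leaf("A"),
            Node(1, 2.0, Leaf("B"), Leaf("C")))
```

For a random forest, the same evaluation runs over each tree in the ensemble and the outputs are aggregated (e.g., by majority vote).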

Note: Extended version of paper appearing in PETS 2016.

Category
Cryptographic protocols
Publication info
Published elsewhere. MAJOR revision. PETS 2016
Keywords
public-key cryptography, applications, machine learning
Contact author(s)
dwu4 @ cs stanford edu
History
2016-05-26: revised
2015-04-29: received
License
Creative Commons Attribution


@misc{cryptoeprint:2015/386,
      author = {David J. Wu and Tony Feng and Michael Naehrig and Kristin Lauter},
      title = {Privately Evaluating Decision Trees and Random Forests},
      howpublished = {Cryptology ePrint Archive, Paper 2015/386},
      year = {2015},
      doi = {popets-2016-0043},
      note = {\url{https://eprint.iacr.org/2015/386}},
      url = {https://eprint.iacr.org/2015/386}
}