Paper 2019/338

Garbled Neural Networks are Practical

Marshall Ball, Brent Carmer, Tal Malkin, Mike Rosulek, and Nichole Schimanski


We show that garbled circuits are a practical choice for secure evaluation of neural network classifiers. At the protocol level, we start with the garbling scheme of Ball, Malkin & Rosulek (ACM CCS 2016) for arithmetic circuits and introduce new optimizations for modern neural network activation functions. We develop fancy-garbling, the first implementation of the BMR16 garbling scheme, together with our new optimizations, as part of a heavily optimized garbled-circuits tool that is driven by a TensorFlow classifier description. We evaluate our constructions on a wide range of neural networks. We find that our approach is up to 100x more efficient than straightforward boolean garbling (depending on the neural network). It is also roughly 40% more efficient than DeepSecure (Rouhani et al., DAC 2018), the only previous garbled-circuit-based approach for secure neural network evaluation, which incorporates significant optimization techniques for boolean circuits. Furthermore, our approach is competitive with other, non-garbled-circuit approaches to secure neural network evaluation.
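To give a flavor of why arithmetic garbling is so much cheaper than boolean garbling for neural networks, the toy Python sketch below illustrates the "free addition" property: when a wire carrying value x is encoded as a label of the form k + x*DELTA mod Q, the evaluator can compute the output label of an addition gate by simply adding the two input labels, with no garbled table and no communication. This is a simplified, insecure illustration only, not the paper's actual scheme (BMR16 uses vector labels over composite moduli, plus garbled projection gates for non-linear operations); all identifiers here (Q, DELTA, garble_wire, eval_add) are hypothetical.

import secrets

Q = 2**61 - 1                 # public modulus for wire values (a prime, for illustration)
DELTA = secrets.randbelow(Q)  # garbler's global secret offset

def garble_wire(x):
    """Encode plaintext value x as (zero-label k, active label k + x*DELTA mod Q)."""
    k = secrets.randbelow(Q)
    return k, (k + x * DELTA) % Q

def eval_add(label_x, label_y):
    """Evaluator adds labels locally; addition gates need no garbled table."""
    return (label_x + label_y) % Q

# Garbler encodes inputs x = 5 and y = 7.
kx, Lx = garble_wire(5)
ky, Ly = garble_wire(7)

# Evaluator derives the output label for x + y without any interaction.
Lz = eval_add(Lx, Ly)

# The result is the correct active label relative to the zero-label kx + ky.
assert Lz == ((kx + ky) + (5 + 7) * DELTA) % Q

In a boolean garbled circuit, the same addition would require a ripple-carry adder with a garbled table per AND gate, which is the gap the paper's up-to-100x improvement exploits on addition-heavy layers.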

Publication info: Preprint. MINOR revision.
Keywords: garbled circuits, neural networks
Contact author(s): bcarmer@gmail.com
History: 2019-06-24: last of 2 revisions; 2019-04-03: received
License: Creative Commons Attribution


@misc{cryptoeprint:2019/338,
      author = {Marshall Ball and Brent Carmer and Tal Malkin and Mike Rosulek and Nichole Schimanski},
      title = {Garbled Neural Networks are Practical},
      howpublished = {Cryptology ePrint Archive, Paper 2019/338},
      year = {2019},
      note = {\url{https://eprint.iacr.org/2019/338}},
      url = {https://eprint.iacr.org/2019/338}
}