
Paper 2016/892

Privacy-Preserving Distributed Linear Regression on High-Dimensional Data

Adrià Gascón and Phillipp Schoppmann and Borja Balle and Mariana Raykova and Jack Doerner and Samee Zahur and David Evans

Abstract

We propose privacy-preserving protocols for computing linear regression models, in the setting where the training dataset is vertically distributed among several parties. Our main contribution is a hybrid multi-party computation protocol that combines Yao's garbled circuits with tailored protocols for computing inner products. Like many machine learning tasks, building a linear regression model involves solving a system of linear equations. We conduct a comprehensive evaluation and comparison of different techniques for securely performing this task, including a new Conjugate Gradient Descent (CGD) algorithm. This algorithm is suitable for secure computation because it uses an efficient fixed-point representation of real numbers while maintaining accuracy and convergence rates comparable to what can be obtained with a classical solution using floating point numbers. Our technique improves on Nikolaenko et al.'s method for privacy-preserving ridge regression (S&P 2013), and can be used as a building block in other analyses. We implement a complete system and demonstrate that our approach is highly scalable, solving data analysis problems with one million records and one hundred features in less than one hour of total running time.
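The abstract's claim that Conjugate Gradient Descent (CGD) can run on an efficient fixed-point representation while matching floating-point convergence can be illustrated outside of any secure-computation machinery. The sketch below is plain Python run in the clear, not the paper's MPC protocol; the names FRAC_BITS, fp_encode, fp_dot, cgd_fixed_point and the choice of 32 fractional bits are assumptions made for illustration only. It solves the ridge-regression normal equations by CGD on integers representing reals scaled by 2^32, truncating after each product, and compares the result to the floating-point solution on a toy instance.

```python
# Illustrative sketch only: conjugate gradient descent (CGD) on a fixed-point
# encoding of the reals, run in the clear (no garbled circuits / MPC).
# FRAC_BITS, fp_encode, fp_dot, cgd_fixed_point are hypothetical names.
import numpy as np

FRAC_BITS = 32                      # assumed number of fractional bits
SCALE = 1 << FRAC_BITS

def fp_encode(x):
    # real -> scaled Python integer (arbitrary precision avoids overflow)
    return np.vectorize(lambda v: int(round(v * SCALE)), otypes=[object])(np.asarray(x, dtype=float))

def fp_decode(x):
    return np.vectorize(lambda v: v / SCALE, otypes=[float])(x)

def fp_dot(a, b):
    # inner product of two encoded vectors, truncated back to FRAC_BITS
    return int((a * b).sum()) // SCALE

def fp_matvec(M, v):
    return np.array([fp_dot(row, v) for row in M], dtype=object)

def cgd_fixed_point(A_fp, b_fp, iters=25):
    # Solve A x = b by conjugate gradient descent entirely in fixed point.
    # Floor division after every product truncates back to FRAC_BITS; the
    # small rounding bias this introduces is ignored in this sketch.
    n = len(b_fp)
    x = np.array([0] * n, dtype=object)
    r = b_fp.copy()                 # residual b - A x  (x starts at 0)
    p = r.copy()                    # search direction
    rs_old = fp_dot(r, r)
    for _ in range(iters):
        Ap = fp_matvec(A_fp, p)
        denom = fp_dot(p, Ap)
        if denom == 0 or rs_old == 0:
            break
        alpha = (rs_old * SCALE) // denom      # fixed-point division
        x = x + (alpha * p) // SCALE
        r = r - (alpha * Ap) // SCALE
        rs_new = fp_dot(r, r)
        beta = (rs_new * SCALE) // rs_old
        p = r + (beta * p) // SCALE
        rs_old = rs_new
    return x

# Toy usage: ridge regression, theta = (X^T X / m + lam I)^{-1} (X^T y / m).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = X @ rng.normal(size=5) + 0.01 * rng.normal(size=200)
lam, m = 0.1, len(X)
A = X.T @ X / m + lam * np.eye(5)
b = X.T @ y / m
print(fp_decode(cgd_fixed_point(fp_encode(A), fp_encode(b))))  # fixed point
print(np.linalg.solve(A, b))                                   # floating point
```

The point of the sketch is the numerical claim in the abstract: with enough fractional bits, truncating after each multiplication does not noticeably change CGD's accuracy or convergence, which is what makes a fixed-point representation attractive inside secure computation where it is far cheaper than floating point.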

Metadata
Available format(s)
PDF
Category
Cryptographic protocols
Publication info
Preprint. MINOR revision.
Keywords
multi-party computation, garbled circuits, linear regression
Contact author(s)
agascon @ inf ed ac uk
History
2017-10-17: last of 4 revisions
2016-09-14: received
Short URL
https://ia.cr/2016/892
License
Creative Commons Attribution
CC BY