In this work, we develop powerful information-theoretic tools for proving lower bounds on the communication complexity of MPC. We restrict ourselves to a concrete three-party setting, in order to bring out the power of these tools without introducing too many complications. Our techniques include the use of a data processing inequality for {\em residual information} (i.e., the gap between mutual information and Gács-Körner common information), a new {\em information inequality} for three-party protocols, and the idea of {\em distribution switching}, by which lower bounds computed under certain worst-case scenarios can be shown to apply to the general case.
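For concreteness, here is a minimal sketch of the quantity in question, using the standard Gács-Körner definition (the notation $\mathsf{CI}$ and $\mathsf{RI}$ below is introduced only for illustration and need not match the paper's):
\[
\mathsf{CI}(X;Y) \;=\; \max_{\substack{K \,:\; H(K\mid X)\,=\,0,\\ H(K\mid Y)\,=\,0}} H(K),
\qquad
\mathsf{RI}(X;Y) \;=\; I(X;Y) \,-\, \mathsf{CI}(X;Y).
\]
Equivalently, if $Q$ denotes the maximal common part of $(X,Y)$ (a $K$ attaining the maximum above), then $\mathsf{RI}(X;Y) = I(X;Y \mid Q)$, since $Q$ is a deterministic function of each of $X$ and $Y$ and hence $I(X;Y) = H(Q) + I(X;Y \mid Q)$.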
Using these techniques, we obtain tight bounds on the communication complexity of MPC protocols for several interesting functions. In particular, we exhibit concrete functions that admit ``communication-ideal'' protocols, which achieve the minimum communication simultaneously on all links in the network. We also obtain the first {\em explicit} example of a function whose secure computation requires more communication than the length of the inputs in the model of Feige, Kilian and Naor \cite{FeigeKiNa94}, who had shown that such functions exist. Finally, we show that our communication bounds imply tight lower bounds on the amount of randomness required by MPC protocols for many interesting functions.
Category / Keywords: foundations / Secure Multi-party Computation, Information Theory, Communication Complexity, Randomness Complexity

Original Publication (in the same form): IACR-CRYPTO-2014