Paper 2024/1062

Compact Key Function Secret Sharing with Non-linear Decoder

Chandan Kumar, IIT Kharagpur, India
Sikhar Patranabis, IBM Research, India
Debdeep Mukhopadhyay, IIT Kharagpur, India
Abstract

We present a variant of Function Secret Sharing (FSS) schemes tailored for point, comparison, and interval functions, featuring compact key sizes at the expense of an additional comparison operation during decoding. While existing FSS constructions are primarily geared towards $2$-party scenarios, exceptions such as the work by Boyle et al. (Eurocrypt 2015) and Riposte (S&P 2015) have introduced FSS schemes for $p$-party scenarios ($p \geq 3$). This paper aims to achieve the most compact $p$-party FSS key size to date. For distributed point functions (DPFs), we achieve a noteworthy $2^p$-factor key-size reduction compared to state-of-the-art FSS constructions, including computationally efficient constructions based on symmetric-key primitives. Compared to the previous public-key-based FSS design for DPFs, we further reduce the key size by a $2^{n/2}$-sized row vector, where $2^n$ is the domain size of the point function. This reduction in key size comes at the cost of a comparison operation performed by the decoder (hence called a non-linear decoder), a departure from prior schemes. In $p$-party scenarios, our construction outperforms existing FSS constructions in key size, remains on par with Riposte in evaluation time, and shows significant improvement over Boyle et al. Beyond DPFs, we extend our approach to distributed comparison and interval functions, achieving the most compact key sizes to date. Our distributed comparison function exhibits a key-size reduction by a factor of $q^{p-1}$, where $q$ denotes the size of the algebraic group used in the scheme's construction. The reduced key size of the comparison function has practical implications, particularly in applications like privacy-preserving machine learning (PPML), where thousands of comparison functions are employed in each neural network layer.
To demonstrate the effectiveness of our improvements, we design and implement a prototype of a scalable privacy-preserving framework for neural networks over distributed models. Specifically, we implement a distributed rectified linear unit (ReLU) activation function using our distributed comparison function, showcasing the efficacy of our proposed scheme.
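As background for the FSS syntax the abstract refers to (key generation, local evaluation, and a decoder that combines the parties' output shares), the sketch below shows the trivial $p$-party FSS for a point function $f_{\alpha,\beta}$: the full truth table is additively shared, and the decoder is linear (it simply sums the shares). Its $O(2^n)$-sized keys illustrate exactly the blow-up that compact constructions avoid; this is a standard toy example, not the paper's scheme, and all names and the group choice are illustrative.

```python
import secrets

Q = 2**16  # toy additive group Z_Q (illustrative choice, not from the paper)

def gen(alpha, beta, n, p):
    """Trivial p-party FSS for the point function f_{alpha,beta}:
    additively share the entire truth table over Z_Q. Each key is a
    2^n-entry vector -- the key-size blow-up compact FSS avoids."""
    table = [beta if x == alpha else 0 for x in range(2**n)]
    keys = [[secrets.randbelow(Q) for _ in range(2**n)] for _ in range(p - 1)]
    # Last key is chosen so that all p shares sum to the truth table.
    keys.append([(table[x] - sum(k[x] for k in keys)) % Q
                 for x in range(2**n)])
    return keys

def eval_share(key, x):
    # Each party evaluates locally on its own key; a single share
    # reveals nothing about alpha or beta.
    return key[x]

def decode(shares):
    # Linear decoder: just add the output shares modulo Q. The paper's
    # scheme instead uses a comparison here (a non-linear decoder).
    return sum(shares) % Q

n, p = 4, 3
keys = gen(alpha=5, beta=42, n=n, p=p)
assert decode([eval_share(k, 5) for k in keys]) == 42  # x == alpha
assert decode([eval_share(k, 7) for k in keys]) == 0   # x != alpha
```

Trading this linear decoder for one that performs a comparison is what lets the paper's construction shrink the keys while keeping $p$-party evaluation local.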

Metadata
Available format(s)
PDF
Category
Foundations
Publication info
Published by the IACR in CIC 2024
Keywords
Function Secret Sharing, Distributed Comparison Function, ReLU
Contact author(s)
cchaudhary278 @ gmail com
sikharpatranabis @ gmail com
debdeep mukhopadhyay @ gmail com
History
2024-06-30: approved
2024-06-29: received
Short URL
https://ia.cr/2024/1062
License
Creative Commons Attribution
CC BY

BibTeX

@misc{cryptoeprint:2024/1062,
      author = {Chandan Kumar and Sikhar Patranabis and Debdeep Mukhopadhyay},
      title = {Compact Key Function Secret Sharing with Non-linear Decoder},
      howpublished = {Cryptology {ePrint} Archive, Paper 2024/1062},
      year = {2024},
      url = {https://eprint.iacr.org/2024/1062}
}