The $k$-nearest neighbors ($k$NN) classifier predicts the class of a query, $q$, by taking the majority class of its $k$ nearest neighbors in an existing (already classified) database, $S$.
In secure $k$NN, $q$ and $S$ are owned by two different parties, and $q$ is classified without sharing data.
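In the clear, the $k$NN rule above is a distance sort followed by a majority vote. A minimal Python sketch (the function and variable names are ours, for illustration only; this is not the secure protocol):

```python
from collections import Counter

def knn_classify(q, S, labels, k):
    """Classify query q by the majority class of its k nearest
    neighbors in the database S (squared Euclidean distance).
    Illustrative cleartext sketch; names are ours, not the paper's."""
    # Distance from q to every database point.
    dists = [sum((qi - si) ** 2 for qi, si in zip(q, s)) for s in S]
    # Indices of the k smallest distances.
    nearest = sorted(range(len(S)), key=lambda i: dists[i])[:k]
    # Majority vote among those neighbors.
    votes = Counter(labels[i] for i in nearest)
    return votes.most_common(1)[0][0]
```

Under HE, the sort and vote are exactly the expensive steps; the relaxation below avoids an exact selection of the $k$ smallest.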
In this work we present a classifier based on $k$NN that is more efficient to implement with homomorphic encryption (HE).
The efficiency of our classifier comes from a relaxation we make to consider $\kappa$ nearest neighbors for $\kappa \approx k$,
with probability that increases as the statistical distance between a Gaussian and the distribution of the distances from $q$ to $S$ decreases.
We call our classifier $k$-ish Nearest Neighbors ($k$-ish NN).
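One cleartext reading of the relaxation: if the distances from $q$ to $S$ were close to Gaussian with mean $\mu$ and standard deviation $\sigma$, then a threshold placed at the $k/|S|$ quantile of that Gaussian would capture roughly $k$ neighbors in expectation. The sketch below follows this reading; it is our own illustration under that assumption, not the paper's encrypted circuit:

```python
from statistics import NormalDist, mean, pstdev

def kish_neighbors(dists, k):
    """Return indices whose distance falls below a Gaussian-model
    threshold chosen so that about k points qualify in expectation.
    Illustrative cleartext sketch; names are ours, not the paper's."""
    n = len(dists)
    mu, sigma = mean(dists), pstdev(dists)
    # Threshold t with Pr[d < t] ~ k/n under a N(mu, sigma^2) model.
    t = NormalDist(mu, sigma).inv_cdf(k / n)
    return [i for i, d in enumerate(dists) if d < t]
```

The closer the true distance distribution is to Gaussian, the closer the number of selected neighbors is to $k$, which matches the probability statement above.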
For the implementation
we introduce a {\em double-blinded coin-toss}, where
both the bias and the output of the toss are encrypted.
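In the clear, a coin with bias $p$ is tossed by comparing uniform randomness against $p$; double blinding means that under HE both the bias and the resulting bit remain encrypted. A cleartext stand-in, with the homomorphic comparison abstracted as a callback (all names here are ours, not the paper's API):

```python
import random

def blinded_coin_toss(enc_bias, compare_encrypted, rng=random):
    """Toss a coin whose bias is (conceptually) encrypted.
    `enc_bias` stands in for an encrypted bias in [0, 1];
    `compare_encrypted` stands in for a homomorphic comparison
    circuit that returns an *encrypted* bit. Both names are
    our own illustration, not the paper's implementation."""
    r = rng.random()                        # public uniform randomness
    return compare_encrypted(r, enc_bias)   # Enc(1) iff r < bias
```

With an identity "encryption", repeated tosses land heads at about the rate of the bias, which is all the later averaging step needs.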
We use it to approximate the average and variance of the distances from $q$ to $S$
in a scalable circuit whose depth is independent of $|S|$.
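One way to read how coin tosses can keep such a circuit shallow: if each (suitably scaled) distance is turned into a single coin toss with that distance as its bias, then the fraction of heads estimates the average, and only single bits ever need to be aggregated. This is a cleartext sketch under our own assumptions about the counting scheme, not the paper's circuit:

```python
import random

def approx_mean(values, trials=20000, rng=None):
    """Estimate the mean of values in [0, 1] by coin tosses:
    each trial samples a value and flips a coin with that bias,
    so the heads frequency estimates the mean. Aggregating single
    bits is what would keep a (conceptual) HE circuit shallow.
    Illustrative sketch; the scheme and names are our assumptions."""
    rng = rng or random.Random(0)
    heads = 0
    for _ in range(trials):
        v = rng.choice(values)      # sample a (scaled) distance
        heads += rng.random() < v   # biased coin toss
    return heads / trials
```

The same trick applied to the squared values estimates the second moment, from which a variance estimate follows.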
We believe these to be of independent interest.
We implemented our classifier in an open-source library based on HElib and tested it on a breast-tumor database.
Our classifier has accuracy and running time comparable to current state-of-the-art (non-HE) MPC solutions, which have better running time but worse communication complexity.
It also has communication complexity similar to that of a naive HE implementation, which has worse running time.