Auditing Information Leakage for Distance Metrics

30 August 2011

Yikan Chen and I are releasing a paper today on Auditing Information Leakage for Distance Metrics. The paper is a first step towards the goal of developing self-auditing secure computations that can determine when the output of a secure computation would leak too much information to be safe to release. Yikan will present the paper at the Third IEEE Conference on Privacy, Security, Risk and Trust in Boston, 9-11 October 2011.

Abstract. Many useful scenarios involve allowing untrusted users to run queries against secret data, so long as the results do not leak too much information. This problem has been studied widely for statistical queries, but not for queries with more direct semantics. In this paper, we consider the problem of auditing queries where the result is a distance metric between the query input and some secret data. We develop an efficient technique for estimating a lower bound on the entropy remaining after a series of query-responses that applies to a class of distance functions including Hamming distance. We also present a technique for ensuring that no individual bit of the secret sequence is leaked. In this paper, we formalize the information leakage problem, describe our design for a query auditor, and report on experiments showing the feasibility and effectiveness of our approach for sensitive sequences up to thousands of bits.
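To make the auditing problem concrete, here is a minimal brute-force sketch of the quantity the paper bounds: the entropy remaining in a secret bit sequence after a series of Hamming-distance query-responses. This is not the paper's algorithm (the paper develops an efficient lower-bound estimate; exact enumeration is only feasible for tiny sequences), and the function names are illustrative assumptions.

```python
import itertools
import math

def hamming(a, b):
    # Hamming distance: number of positions where the sequences differ
    return sum(x != y for x, y in zip(a, b))

def remaining_entropy(n, responses):
    """Exact remaining entropy (in bits) of an n-bit secret, given a
    list of (query_bits, reported_distance) pairs.

    Brute force: enumerate all 2^n candidate secrets and keep those
    consistent with every response. Under a uniform prior, the entropy
    is log2 of the number of consistent candidates.
    """
    consistent = [
        s for s in itertools.product((0, 1), repeat=n)
        if all(hamming(s, q) == d for q, d in responses)
    ]
    return math.log2(len(consistent)) if consistent else 0.0

# Example with a hypothetical 8-bit secret:
print(remaining_entropy(8, []))             # → 8.0 bits before any query
print(remaining_entropy(8, [((0,) * 8, 3)]))  # one response narrows it to C(8,3)=56 candidates, ≈ 5.81 bits
```

An auditor in this spirit would refuse to release a response whenever the resulting remaining entropy falls below a safety threshold; the paper's contribution is computing such a bound efficiently for sequences far too long to enumerate.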

Full paper: [PDF, 10 pages]