Commit

Add references
csharrison authored and cjpatton committed Aug 2, 2021
1 parent 78e4dbb commit 858a8c4
Showing 1 changed file with 22 additions and 8 deletions.
30 changes: 22 additions & 8 deletions draft-pda-protocol.md
@@ -48,6 +48,20 @@ informative:
      - ins: N. Gilboa
      - ins: Y. Ishai

  JD02:
    title: "The Sybil Attack"
    date: 2002-10-10
    target: "https://link.springer.com/chapter/10.1007/3-540-45748-8_24"
    author:
      - ins: J. Douceur

  SV16:
    title: "The Complexity of Differential Privacy"
    date: 2016-08-09
    target: "https://privacytools.seas.harvard.edu/files/privacytools/files/complexityprivacy_1.pdf"
    author:
      - ins: S. Vadhan

normative:

FIPS180-4:
@@ -1230,6 +1244,7 @@ mitigations available to aggregators also apply to the leader.
1. Known input injection. Collectors may collude with clients to send known
input to the aggregators, allowing collectors to shrink the effective
anonymity set by subtracting the known inputs from the final output.
Sybil attacks {{JD02}} could be used to amplify this capability.

#### Mitigations

@@ -1315,14 +1330,13 @@ choose minimum batch sizes.
## Differential privacy {#dp}

Optionally, PDA deployments can choose to ensure their output F achieves
-[differential privacy](https://en.wikipedia.org/wiki/Differential_privacy).
-A simple approach would require the aggregators to add two-sided
-noise (e.g. sampled from a two-sided geometric distribution) to outputs.
-Since each aggregator is adding noise independently, privacy can be guaranteed
-even if all but one of the aggregators is malicious. Differential privacy is a strong
-privacy definition, and protects users in extreme circumstances: Even if an
-adversary has prior knowledge of every input in a batch except for one, that
-one record is still protected.
+differential privacy {{SV16}}. A simple approach would require the aggregators
+to add two-sided noise (e.g. sampled from a two-sided geometric distribution)
+to outputs. Since each aggregator is adding noise independently, privacy can be
+guaranteed even if all but one of the aggregators is malicious. Differential
+privacy is a strong privacy definition, and protects users in extreme
+circumstances: Even if an adversary has prior knowledge of every input in a
+batch except for one, that one record is still protected.
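
As a non-normative illustration of the noising step described in the paragraph
above, the sketch below shows how one aggregator might draw two-sided geometric
(discrete Laplace) noise before releasing an integer aggregate. The `eps` and
`sensitivity` parameters, the two-aggregator assumption, and the use of
Python/NumPy are choices made for this example only and are not part of the
draft.

```python
import numpy as np

def two_sided_geometric(eps: float, sensitivity: int = 1, rng=None) -> int:
    """Sample two-sided geometric (discrete Laplace) noise providing eps-DP
    for an integer-valued output with the given L1 sensitivity."""
    rng = rng or np.random.default_rng()
    p = 1.0 - np.exp(-eps / sensitivity)
    # The difference of two i.i.d. geometric draws is symmetric around zero,
    # with P(Z = k) proportional to exp(-eps * |k| / sensitivity).
    return int(rng.geometric(p) - rng.geometric(p))

# Hypothetical aggregate before noising. Each of the (here, two) aggregators
# perturbs its own contribution, so the released sum remains differentially
# private as long as at least one aggregator added its noise honestly.
true_count = 1234
noisy_output = true_count + sum(two_sided_geometric(eps=1.0) for _ in range(2))
```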

## Multiple protocol runs
