Privacy Considerations for service endpoints #324
This is what PR #232 was attempting to address as well. The general consensus of the WG is that these concerns should be considered and addressed, but I am still a bit fuzzy about what language you'd like to add. Would you mind submitting a PR with proposed changes, or some additional language here in the comments?
The privacy considerations for DIDs are very well described in SIOP, but I am inexperienced at writing spec text to capture them in a PR by myself. I could work together with someone familiar with both SIOP and SDS to strengthen this privacy section.
Control
Controller Scenario 1: Controller is the Subject. "I create and control my own DID"
Controller Scenario 2: Controller is fiduciary to the Subject. "Someone/thing I trust manages my DID"
Controller Scenario 3: Controller is hostile to the Subject. "Someone who I don't trust manages a DID about me"
Disclosure: Information in
@talltree I would love to hear your thoughts on how Privacy and Governance are related.
@OR13 says:
and frames the issue in terms of information asymmetry. It's up to us in this WG, however we decide to handle the rubrics of decentralization, to make it clear that asking the subject for authorization to index is the same as asking for authorization to access. Let Moore's law, intersecting the value of high-quality personal information, take care of the decreasing relative cost of this added friction.
A survey for everyone, please: https://docs.google.com/forms/d/e/1FAIpQLSc8Z8FklORke1iPRoyo90GNWqqXkmdbgQLNvHvU-v4XvLxO0A/viewform?usp=sf_link Also, a draft PING document is now available and will be put on the agenda for broader discussion soon.
@OR13 Whew, that is a seriously deep topic. I share your concern about publicly writable verifiable data registries (VDRs) and the potential that they could be: a) overwhelmed with spam DID documents, or b) suffer "GDPR attacks" by having DID documents with personal data written to them that can then mire the VDRs in GDPR erasure requests. (For a deep discussion of the latter, see the Sovrin Foundation white paper on Data Privacy Regulation and Distributed Ledger Technology.) I wish I had a magic wand to wave over these problems, but they are very real and, to the best of my knowledge, there is no magic bullet. So I think we just have to point out these challenges in our Privacy and Security Considerations sections. On the 2020-07-28 DID WG call, we also discussed the possibility of the WG authoring a separate note about these specific concerns, since they may be fundamental to broad adoption of DID infrastructure. Provided we do that after we go to CR, I'm willing to volunteer to help with that paper.
There was extensive discussion on the DID WG public mailing list on this topic as well. It seems there's additional language that we can add at this point. Last I remember, @agropper was looking for additional language above and beyond #232 and would be willing to work with others to draft that language, but was hoping someone else could step in and help with submitting the PR to make the edit, or to write the note mentioned in the comment right above this.
@talltree I took a scan of the Sovrin Foundation paper. It seems intended as a compliance aid rather than a rubric for decentralization. Nothing wrong with that, but it makes privacy analysis difficult and generally as ethically speculative as the pablum "Privacy by Design". I hope we're designing DID Core not just for compliance with today's GDPR / PIPEDA / CCPA, but for a world where individuals have a less asymmetric relationship with our benevolent platform operators. One way to do that is to design DID Core for Privacy by Default. One strategy for achieving Privacy by Default is to keep all data worth "registering" behind a mediator and/or an authorization server. I understand that this strict definition of DID Core might be unacceptable to operators of private DLTs and some federations. The question then becomes: how do we define interoperability (across methods and federations), and can we word DID Core to both achieve Privacy by Default and satisfy the full range of methods and federations?
We're getting ready to move into CR... which means this issue needs to result in proposed spec text and a PR very soon now, or it'll be deferred until a later specification. Someone needs to take an action to write a PR. |
I'll get some text proposed shortly. |
PR #616 has been merged, closing. |
Prior to PING review, and linked to our many layering discussions in SDS, our Privacy Considerations section, and Section 10.4 Herd Privacy in particular, might be reviewed.
5.7 says: "One of the primary purposes of a DID document is to enable discovery of service endpoints." The Privacy Considerations section does a good job of discussing correlation risks in general but is light on the risks and mitigations related to service endpoints.
As I understand it, the relationship between pseudonymous DIDs, herd privacy, and service endpoints implies that, in all "Privacy by Design" use cases, the number of service endpoints SHOULD be one, in order to reduce the risk of correlation and enhance the effectiveness of Herd Privacy.
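To make the single-endpoint recommendation above concrete, here is a minimal sketch, not normative text: a pseudonymous DID document whose only service entry points at a single consent-based mediator. The `did:example` identifier, the `Mediator` service type, and all URIs are hypothetical placeholders, not values defined by DID Core.

```python
def make_minimal_did_document(did: str, mediator_uri: str) -> dict:
    """Build a DID document with exactly one service endpoint.

    All values here are illustrative; the 'Mediator' type is a
    hypothetical placeholder, not a registered service type.
    """
    return {
        "@context": "https://www.w3.org/ns/did/v1",
        "id": did,
        "service": [
            {
                "id": f"{did}#mediator",
                "type": "Mediator",
                "serviceEndpoint": mediator_uri,
            }
        ],
    }


def endpoint_count(did_document: dict) -> int:
    """Count service endpoints; per the reasoning above this SHOULD be 1."""
    return len(did_document.get("service", []))


doc = make_minimal_did_document(
    "did:example:123456789abcdefghi", "https://mediator.example/inbox"
)
assert endpoint_count(doc) == 1
```

The point of the sketch is the shape, not the values: anything a requesting party learns from this document is the mediator's address, which reveals nothing subject-specific on its own.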
Requesting parties who discover a pseudonymous DID by whatever means, especially a search of "de-identified" metadata in various directories, SHOULD be directed to some kind of consent-based access control service with a minimum loss of entropy. This is especially important for DID methods that consider mitigation of the GDPR Right to be Forgotten, and in light of the fact that GDPR, CCPA, and HIPAA all allow unlimited use of de-identified data with little enforcement attention to re-identification.
My suggestion would be to enhance the Privacy Considerations section with a discussion of how access control services could be proxied or otherwise "tumbled" by a herd privacy intermediary, chosen by the DID controller or data subject independently of the access control service itself. It would then be up to the DID controller to decide whether the one service endpoint in a pseudonymous DID is a dumb proxy mechanism like Tor or an access control service that itself provides herd privacy.
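One way to picture the "tumbling" idea above is as an endpoint rewrite: many pseudonymous DID documents advertise the same intermediary URI as their sole endpoint, so the endpoint value itself can no longer distinguish subjects. The sketch below is purely illustrative under that assumption; the shared proxy URI and `Agent` service type are hypothetical, and a real deployment would involve a resolver or method-level mechanism rather than a literal dictionary rewrite.

```python
# Shared herd-privacy intermediary (hypothetical URI for illustration).
HERD_PROXY = "https://herd-proxy.example/forward"


def tumble_endpoints(did_document: dict, proxy_uri: str = HERD_PROXY) -> dict:
    """Return a copy of the DID document with every distinct service
    endpoint replaced by the shared intermediary URI."""
    tumbled = dict(did_document)  # shallow copy; original left untouched
    tumbled["service"] = [
        {**svc, "serviceEndpoint": proxy_uri}
        for svc in did_document.get("service", [])
    ]
    return tumbled


# Three distinct pseudonymous DIDs, each with its own (correlatable) endpoint.
docs = [
    {
        "id": f"did:example:{i}",
        "service": [
            {
                "id": f"did:example:{i}#agent",
                "type": "Agent",
                "serviceEndpoint": f"https://agent{i}.example/",
            }
        ],
    }
    for i in range(3)
]

tumbled = [tumble_endpoints(d) for d in docs]

# After tumbling, all documents expose an identical endpoint value,
# so the endpoint no longer partitions the herd.
endpoints = {svc["serviceEndpoint"] for d in tumbled for svc in d["service"]}
assert endpoints == {HERD_PROXY}
```

The design choice this illustrates: correlation resistance comes from making endpoint values identical across the herd, which is exactly why the intermediary must be selectable independently of any one access control service.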