Tombstoning every message could be hard, especially with garbage collection or for users with years of history
Deletion of activity content associated with messages: we should require it, but how do we enforce it?
GDPR implies there is a legally enforceable “deletion request” that must be respected for legal operation in the EU
From a DSNP Spec perspective, we want to say that a DSNP Spec-compliant implementation MUST implement a way to disconnect from current user state, remembering only that the DSNP ID used to exist (and forgetting all other information?). For example, any batch row from this user should be ignored.
Responding to a deletion request must not invalidate an entire batch or other unrelated/adjacent data (a minimal sketch of this row-level filtering follows)
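A minimal sketch of how a processor might honor the two points above, assuming a hypothetical `BatchRow` shape and a locally maintained set of forgotten DSNP IDs (none of these names come from the DSNP Spec): rows from a forgotten user are skipped at read time, and unrelated/adjacent rows in the same batch remain valid.

```typescript
// Hypothetical sketch: skip rows from "forgotten" users instead of
// invalidating the whole batch. Field names are illustrative only.
interface BatchRow {
  fromId: string;           // DSNP ID of the announcing user
  announcementType: number;
  contentHash: string;
  url: string;
}

function visibleRows(rows: BatchRow[], forgottenIds: Set<string>): BatchRow[] {
  // Rows from forgotten users are ignored; adjacent rows stay valid.
  return rows.filter((row) => !forgottenIds.has(row.fromId));
}

// Usage sketch: only the row from "111111" survives.
const forgotten = new Set(["424242"]);
const rows: BatchRow[] = [
  { fromId: "424242", announcementType: 2, contentHash: "0xaaa", url: "https://example.org/a" },
  { fromId: "111111", announcementType: 2, contentHash: "0xbbb", url: "https://example.org/b" },
];
console.log(visibleRows(rows, forgotten));
```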
We need to ensure there is no Personally Identifiable Information (PII) in batches, because such information must be removed on request under GDPR and similar laws
What currently exists in chain storage that is, or references, a user identity:
Static ID
Graph
Delegations
Ideas:
A freeze state that allows no changes for a time period (deactivation period)
Does this mean no replies, reactions, mentions, etc.? If so, that can be done on chain by first removing all delegations for that static ID
This is not sufficient for “deletion,” since the related state can still be read
A special type of batch announcement for “forgetting” the user identity, presumably to trigger deletion of activity content (see the sketch after this list)
A chain call that returns all of a user’s on-chain data for the user to download before the data is removed
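To make the “forgetting” announcement idea above concrete, here is a rough sketch assuming a hypothetical announcement type and indexer-side handler; none of these names or fields exist in the DSNP Spec, and the real design would come out of the ideas listed here.

```typescript
// Hypothetical "forget user" announcement; type value and fields are placeholders.
interface ForgetAnnouncement {
  announcementType: 100;   // placeholder, not a real DSNP announcement type
  dsnpId: string;          // the identity to be forgotten
  createdAt: number;       // unix timestamp of the request
}

// An indexer encountering a Forget announcement would delete its stored
// activity content for dsnpId and remember only that the ID once existed.
function handleForget(a: ForgetAnnouncement, forgottenIds: Set<string>): void {
  forgottenIds.add(a.dsnpId);
  // A hypothetical cleanup hook would also purge cached activity content
  // and graph data associated with a.dsnpId here.
}
```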
Other notes:
A user leaving an app is different from this; that is just removing a delegation (de-authorization)
The chain state storage database does store state history, but it is limited; it is possible to go back to a certain point and retrieve state as of when a given batch was published, but the history may not be complete. So it is uncertain whether simply deleting the static ID references from state would be GDPR-compliant.
Peter asked: why use a chain if you can just delete the state? Isn’t a blockchain supposed to be immutable? Answered by Wil (summarized):
Blockchain ledgers are immutable, but current blockchain state is completely mutable.
Not all blockchains keep the data from historical ledgers, but someone always could
We always validate against current blockchain state rather than the historical ledger
The spec requirement should state that the user’s request for deletion must be valid under “applicable law including GDPR,” so that there is clear legal pressure to implement such a feature and it is clear that the requirement stems from applicable laws we do not control.
What about 3rd parties that collect, index, or cache the content? How does this affect users’ deletion rights with respect to DSNP, since we can’t control 3rd-party behavior (we need legal guidance on this)? GDPR defines processors and controllers, with the rights and responsibilities that go with those roles, but this does not cover 3rd parties outside our control (e.g. users taking screenshots or 3rd parties scraping data without permission).
DIP-149
Removes the Reaction message type from Tombstoning; instead of using a Tombstone message type to retract Reaction messages, they are “undone” in a simpler way, more like GraphChange “unfollow.” Two potential solutions were proposed. The DIP-149 Issue was presented and briefly discussed. We voted to proceed with this DIP proposal and move discussion of the options to the Issue on GitHub.
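For illustration only, the contrast might look roughly like the sketch below; these interfaces and field names are assumptions and are not taken from DIP-149 or the Spec.

```typescript
// Option A: Tombstone-style retraction (the approach DIP-149 moves away from
// for Reactions). Field names are illustrative.
interface TombstoneLike {
  announcementType: "Tombstone";
  fromId: string;
  targetAnnouncementType: "Reaction";
  targetHash: string;   // identifies the Reaction being retracted
}

// Option B: an "undo" flag carried by the Reaction itself, analogous to
// GraphChange unfollow.
interface ReactionLike {
  announcementType: "Reaction";
  fromId: string;
  emoji: string;
  inReplyTo: string;    // URI of the content reacted to
  apply: 0 | 1;         // 1 = add the reaction, 0 = remove it
}
```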
I don’t think any user data should be stored on-chain, but on IPFS or something similar. On-chain data should only ever hold pointers to mutable data. Otherwise, it will be impossible to prevent CSAM proliferation.
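A minimal sketch of that “pointers only” idea, assuming an illustrative record shape (not the actual DSNP schema): the chain holds only a URI and content hash, and if the off-chain content is deleted the pointer simply dangles.

```typescript
// Illustrative on-chain record: a pointer to mutable, deletable off-chain content.
interface OnChainPointer {
  fromId: string;      // pseudonymous DSNP ID
  url: string;         // where the content lives off-chain (e.g. an IPFS gateway URL)
  contentHash: string; // hash of the content at publication time
}

// If the off-chain content has been removed (takedown, erasure request, etc.),
// the fetch fails and nothing identifying remains on chain.
async function fetchIfStillAvailable(p: OnChainPointer): Promise<string | null> {
  try {
    const res = await fetch(p.url);
    return res.ok ? await res.text() : null;
  } catch {
    return null; // content was deleted or is unreachable
  }
}
```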
One other way to ensure any service will be able to comply with GDPR is to encrypt any Personally Identifiable Information (PII) data (which obviously is going to be stored off-chain), and for any request from the user to be forgotten, the service can delete those keys.
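A sketch of that “crypto-shredding” pattern using Node’s built-in crypto module; the per-user key store and function names here are assumptions for illustration. Deleting the user’s key leaves any retained ciphertext unreadable, which is the basis for treating the PII as erased.

```typescript
import { createCipheriv, createDecipheriv, randomBytes } from "crypto";

// Assumed per-user key store; in practice this would live in a secure key service.
const userKeys = new Map<string, Buffer>();

function encryptPii(dsnpId: string, plaintext: string): { iv: Buffer; data: Buffer } {
  let key = userKeys.get(dsnpId);
  if (!key) {
    key = randomBytes(32);        // one AES-256 key per user
    userKeys.set(dsnpId, key);
  }
  const iv = randomBytes(16);
  const cipher = createCipheriv("aes-256-cbc", key, iv);
  return { iv, data: Buffer.concat([cipher.update(plaintext, "utf8"), cipher.final()]) };
}

function decryptPii(dsnpId: string, iv: Buffer, data: Buffer): string | null {
  const key = userKeys.get(dsnpId);
  if (!key) return null;          // key was shredded: the PII is effectively erased
  const decipher = createDecipheriv("aes-256-cbc", key, iv);
  return Buffer.concat([decipher.update(data), decipher.final()]).toString("utf8");
}

// Handling an erasure request: destroy the key, keep only unreadable ciphertext.
function forgetUser(dsnpId: string): void {
  userKeys.delete(dsnpId);
}
```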
Agree with both @mikestaub and @aramik; “Right to be Forgotten (erasure)” is not absolute. Other conflicting legislation may exist that requires that PII is not erased but archived for future reference, e.g. the history of ownership of land and property assets. Most Land Registries have legal requirements to maintain a history of authentic ownership, i.e. the PII involved in property ownership cannot be forgotten. In terms of Social Networks, though, I suspect that “Right to be Forgotten” will need to be offered, as I’m not aware of any OTHER legislation that would prohibit this. There is no OTHER regulation of Social Platforms yet, that I am aware of, that would override data protection regulation (GDPR/CCPA++). Kind Regards, Patrick.
Thanks for pointing out this is not absolute. Another example: at least some existing social network platforms will not “forget” someone, or their content, when that content is unlawful or is evidence of a crime, particularly if law enforcement requests retention. Deletion of this content would itself be illegal in many jurisdictions.
You are very welcome @Shannon. All the contributions to this thread are in agreement that identity data and its lifecycle management is better off-chain, leaving the blockchain to process a genuine user public key. You have raised a critical point about law enforcement difficulties with E2EE and securing evidence of unlawful, criminal, or harmful content. Deletion or failure to disclose decrypted content is also a criminal offence in most jurisdictions. Law enforcement can use a court order / warrant to compel an organisation to provide either decrypted content or a decryption key. Apple, for example, do not hold a copy of users’ device private encryption keys, so they can’t decrypt and don’t want to compromise user privacy (DoJ v Apple some years ago relating to a suicide bomber’s iPhone). For minimum impact on DSNP, it’s probably better that the DSNP can, under a warrant to disclose (if that’s necessary for a public key), provide a user public key that points to a 3rd party identity manager for further disclosure about the real user’s identity. Law enforcement can then seek the private key from that user to enable decryption, assuming that the person is alive. Hope this helps and thanks for your point. Thoughts?
Thanks for that extra detail. I’m sure a lot of people don’t already know that stuff so it’s good to repeat it here, especially for context.
It seems many coming to this forum are unclear that individuals’ DSNP message data will not be stored on chain, with the exception of a pseudonymous graph. It wasn’t ever stored on chain, even in our Ethereum prototype (Edit: in fact the actual message content is two+ removes from the chain). In our current prototype, the graph and the pseudonymous identifier can be deleted entirely. The discussion for that week was about both on- and off-chain data – fundamentally, what tools we have to obey the law. We are extremely cognizant and very focused on the issues around CSAM content.
Then there is the argument about PII (Edit: by “argument” I mean formal argument, not altercation). There is legally-defined PII, and then one can have a pseudonymous digital fingerprint. By using the internet, we are all giving up some level of anonymity. If we want to participate fully in a social network, we will give up even more. People want to participate in social networks. We want to help them do that in a way that gives them power over their digital fingerprint, including the ability to remove it and say who can use it.
I very strongly encourage people to please read the DSNP Specification (linked in the original post) before making judgments about what we’re doing, preferably some of the other resources as well. This project has been around for over two years now, and has evolved and matured a lot over that time.
Also I want to emphasize that we welcome thoughtful, clear criticism. We read it all.
Thank you for this. Yes, there is, as you say, “legally-defined PII” (an off-chain data lifecycle) which can be associated with a pseudonymous identifier (key pair) that can be relied upon as genuine for on-chain use. “In our current prototype, the graph and the pseudonymous identifier can be deleted entirely” would enable users to exercise their Right To Be Forgotten themselves. For sure, egregious CSAM will soon be a criminal offence (UK Online Safety Bill), with criminal liability for platforms that do not take it down: Draft Online Safety Bill - GOV.UK (www.gov.uk). California has also tabled similar legislation. Thanks again and regards, Patrick
@James, maybe lawful content and its moderation is a topic in its own right, given that regulation(s) will determine what is acceptable behaviour on all platforms? Just a thought. Kind Regards, Patrick