Earlier this week, W3C held a workshop on privacy and data usage control. Among the submitted position papers are quite a few interesting thoughts, and though I couldn’t attend the workshop, it will be good to see the eventual report from it.
I did manage to submit a paper that explores the contributions of User-Managed Access (UMA) to letting people control the usage of their personal data. It was a chance to capture an important part of the philosophy we bring to our work, and the challenges that remain. From the paper’s introduction:
…UMA allows a user to make demands of the requesting side in order to test their suitability for receiving authorization. These demands can include requests for information (such as "Who are you?" or "Are you over 18?") and promises (such as "Do you agree to these non-disclosure terms?" or "Can you confirm that your privacy and data portability policies match my requirements?").
The implications of these demands quickly go beyond cryptography and web protocols and into the realm of agreements and liability. UMA values end-user convenience, development simplicity, and web-wide adoption, and therefore it eschews such techniques as DRM. Instead, it puts a premium on user visibility into and control over access criteria and the authorization lifecycle. UMA also seeks at least a minimum level of enforceability of authorization agreements, in order to make the act of granting resource access truly informed, uncoerced, and meaningful. Granting access to data is then no longer a matter of mere passive consent to terms of use. Rather, it becomes a valuable offer of access on user-specified terms, more fully empowering ordinary web users to act as peers in a network that enables selective sharing.
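To make the claims-gathering idea above a bit more concrete, here is a minimal, purely illustrative sketch of how an authorization manager might test a requester's claims and promises against a user's stated terms before granting access. This is not the UMA wire protocol; the Policy structure, the claim and promise names, and the evaluate_request function are all assumptions introduced only for illustration.

```python
# Hypothetical sketch (not the UMA protocol itself): an authorization manager
# evaluates a requester's submitted claims and promises against the user's
# access policy before issuing a grant. All names here are illustrative.

from dataclasses import dataclass, field


@dataclass
class Policy:
    """User-specified terms a requester must satisfy to gain access."""
    required_claims: dict = field(default_factory=dict)   # e.g. {"over_18": True}
    required_promises: set = field(default_factory=set)   # e.g. {"non_disclosure"}


def evaluate_request(policy: Policy, claims: dict, promises: set) -> dict:
    """Grant access if the requester meets the user's terms; otherwise
    report what is still needed (the 'demands' described above)."""
    missing_claims = {
        name for name, expected in policy.required_claims.items()
        if claims.get(name) != expected
    }
    missing_promises = policy.required_promises - promises

    if not missing_claims and not missing_promises:
        return {"access": "granted"}
    return {
        "access": "denied",
        "need_claims": sorted(missing_claims),
        "need_promises": sorted(missing_promises),
    }


if __name__ == "__main__":
    policy = Policy(
        required_claims={"over_18": True},
        required_promises={"non_disclosure", "no_onward_sharing"},
    )
    # A first attempt that only identifies the requester falls short...
    print(evaluate_request(policy, {"identity": "alice@example.com"}, set()))
    # ...and succeeds once the demanded claim and promises are supplied.
    print(evaluate_request(
        policy,
        {"identity": "alice@example.com", "over_18": True},
        {"non_disclosure", "no_onward_sharing"},
    ))
```

In a real deployment the interesting part is not this bookkeeping but how the promises are recorded and enforced, which is exactly where the paper argues the hard problems lie.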
Some of the challenges are technical, some legal, and some related to business incentives. The paper approaches the discussion with what I hope is a sense of realism, along with some justified optimism about near-term possibilities.
(Speaking of which, I like the realism pervading Ben Laurie’s recent criticism of the EFF’s suggested bill of privacy rights for social network users. He cautions them to stay away from implicitly mandating mechanisms like DRM — and, in focusing on broader aims, to be careful what they wish for.)
If you’re so inclined, I hope you’ll check out the paper and the other workshop inputs and outputs.