
Giving data back to its users, MIT Media Lab creates the Personal Data Store

29 Sep 2014

MIT Media Lab researchers have developed a new program, openPDS/SafeAnswers, whose starting point is that users collect their own data, and apps can only ask questions about this data, not get access to the data itself. A great way to protect privacy, if adopted generally.

The core of the program is the Personal Data Store, which in the words of the researchers is a:

(…) a field-tested personal data store (PDS) allowing users to collect, store, and give fine-grained access to their metadata to third parties.

Thus, the users themselves collect the data in the PDS, and apps or websites that need certain information can request it from the user’s PDS rather than collect it themselves.

openPDS can be installed on any server under the control of the individual (personal server, virtual machine, etc) or can be provided as a service (SaaS by independent software vendors or application service providers). (…)
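To make the collection side concrete, here is a minimal sketch of what such a store could look like. This is purely illustrative Python, not the actual openPDS code; the class and method names are made up.

import time


class PersonalDataStore:
    """Toy personal data store: the user's own device or server
    appends raw metadata records and keeps them locally."""

    def __init__(self):
        self._records = []  # the raw metadata never leaves this object

    def ingest(self, kind, payload):
        """Store one raw metadata record, e.g. a GPS fix or accelerometer reading."""
        self._records.append({"kind": kind, "payload": payload, "ts": time.time()})

    def records(self, kind):
        """Internal access only: meant for questions running *inside* the PDS."""
        return [r for r in self._records if r["kind"] == kind]


# The user's phone or home server pushes readings into its own store:
pds = PersonalDataStore()
pds.ingest("gps", {"lat": 42.3601, "lon": -71.0942})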

Under the openPDS/SafeAnswers mechanism, a piece of code would be installed inside the user’s PDS. The installed code would use the sensitive raw metadata (such as raw accelerometer readings or GPS coordinates) to compute the relevant piece of information within the safe environment of the PDS. In practice, researchers and applications submit code (the question) to be run against the metadata, and only the result (the answer) is sent back to them.

(…) [T]his simple idea allows individuals to fully use their data without having to share the raw data.
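In code, “shipping code, not data” boils down to the PDS exposing a single entry point that runs a submitted question next to the raw records and returns only the computed answer. Continuing the toy sketch above (again hypothetical, not the real openPDS/SafeAnswers API, which would also sandbox and vet the submitted code):

class SafeAnswersEndpoint:
    """Runs a third-party 'question' inside the PDS and returns only its answer."""

    def __init__(self, pds):
        self._pds = pds

    def ask(self, question):
        # The question executes next to the raw metadata inside the PDS...
        answer = question(self._pds)
        # ...and only its small, aggregated result is sent back to the caller.
        return answer


# A researcher ships a question, not a request for the raw data:
def count_gps_fixes(pds):
    return len(pds.records("gps"))  # a single number, not the coordinates


endpoint = SafeAnswersEndpoint(pds)  # wraps the store from the previous sketch
print(endpoint.ask(count_gps_fixes))  # -> 1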

For example, an app that sends you a reminder to take out the trash when you are at home would no longer need to collect all your location data in order to pinpoint that you are ‘home’ at a certain moment. Rather, this (sensitive) location data is kept by the user, and the app asks the openPDS system via SafeAnswers whether you are in fact home. The app then gets the information it needs without ever seeing a log of all your location data. From the website of openPDS:

SafeAnswers allows applications to ask questions that will be answer [sic] using the user’s personal data. In practice, applications will send code to be run against the data and the answer will be send back to them. openPDS ships code, not data. openPDS turns a very hard anonymization problem to an easier security problem.
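The trash-reminder example maps directly onto this pattern: the app ships a small “am I home?” question and only a yes/no comes back. A hypothetical version, continuing the sketch above (the home coordinates, radius, and distance helper are made up for illustration):

import math

HOME = {"lat": 42.3601, "lon": -71.0942}  # stored inside the user's PDS
RADIUS_KM = 0.1                           # within ~100 m counts as "home"


def _distance_km(a, b):
    """Great-circle (haversine) distance between two lat/lon points, in km."""
    lat1, lon1, lat2, lon2 = map(
        math.radians, (a["lat"], a["lon"], b["lat"], b["lon"])
    )
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * math.asin(math.sqrt(h))


def am_i_home(pds):
    """Question shipped by the trash-reminder app: returns only True or False."""
    fixes = pds.records("gps")
    if not fixes:
        return False
    return _distance_km(fixes[-1]["payload"], HOME) <= RADIUS_KM

Submitted through the endpoint from the previous sketch (endpoint.ask(am_i_home)), the app receives a single boolean and never sees the underlying GPS log.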

Indeed, what the researchers have understood is that anonymization of data is very difficult, to say the least, as we have already shown here as well. If you are interested in reading more about this topic, I can recommend the readings of . What the researchers of the MIT Media Lab have therefore designed is a workaround: rather than hand over the raw data, or metadata, in anonymized form to a third party, they have opted to give some control back to the user by allowing him or her to share only very specific personal metadata.

This idea of users deciding how much privacy they wish to give up in return for a particular service, down to the level of an individual app or website, is reminiscent of ‘the privacy butler’ proposed by Lawrence Lessig in ‘Code and Other Laws of Cyberspace’ (1999). In it, he proposes developing software that negotiates the user’s privacy on their behalf:

The user sets her preferences once—specifies how she would negotiate privacy and what she is willing to give up—and from that moment on, when she enters a site, the site and her machine negotiate. Only if the machines can agree will the site be able to obtain her personal data.

Of course, openPDS/SafeAnswers does not quite create this electronic negotiator, but both stem from the idea that the user should be more in control of access to their information. Whereas Lessig envisaged a negotiating Cyber-Jeeves, the MIT Media Lab has created something akin to an electronic gatekeeper, which supplies only the bare necessities in terms of information and does not negotiate.

The biggest caveat to the project is its implementation. As noted over at Co.Exist, MIT is not the first to try to give people more power over their data, but earlier initiatives stalled because large-scale implementation never took off. And that is exactly where the MIT project’s Achilles’ heel is located. Implementing the openPDS system requires that all current apps redesign their software, as they are not built to communicate with SafeAnswers but to handle the raw data themselves. If app developers can be persuaded to create PDS-aware versions of their apps, then we will have truly made some great steps in the right direction. Developers might be inclined to make the switch, as they would no longer have to write code to handle raw data, but only specific questions for the SafeAnswers framework. At the same time, however, the openPDS system may undermine existing business models that are based on the (broad) collection of personal (meta)data.

 

