It should be assumed that any data provided to third parties could be hacked and leaked onto the dark web. So when I expressed concern to Cohen that the de-identified data being collected could be unlocked with the right biometric key, his response was that you’d need access to the full set of data, and that this data is stored securely on their private servers. But that information could still be hacked and leaked, and there could be a lot of unintended consequences of allowing biometric data to be captured and recorded in what is presumed to be a safe vault but could find its way into the wrong hands.
Cohen’s response to my concern implies that the data are completely safe in their hands, and that we shouldn’t worry about this scenario. Perhaps it’s low probability, but I’d argue that we should be thinking about the real risk that decades’ worth of biometric data could eventually leak onto the dark web and be unlocked with biometric signatures, and about what could happen if a bad actor wanted to manipulate us with access to the most intimate data about our unconscious behaviors, values, and beliefs. Engineering the future involves all sorts of risks and tradeoffs, and it may turn out that some of these dystopian worst-case scenarios are so unlikely that they aren’t worth worrying about. But perhaps we should be imagining these worst-case scenarios in order to think deeply about the risks of what data are being collected, and whether or not biometric data will ever be fully de-identifiable.
So overall, the impression that I got from Hall and Cohen is that Oculus is earnestly trying to be on the right side of transparency, and that they’re trying to build trust with users in order to grow the VR and AR ecosystem. The problem I have is that there is still a lack of full transparency and communication about the types of data that are collected and how they’re stored, as well as about what types of data may prove interesting and valuable for Facebook to use for advertising purposes.
The line between Oculus and Facebook continues to blur, and so I can’t help but read the privacy policy through the lens of the worst-case scenario of how Facebook might want to gather biometric data about people to feed into their advertising systems. Oculus provided a lot of transparency about the data that are being collected, and hopefully their My Privacy Tool will help with that. But there are entire classes of data, and specifics of how the data are captured and stored, that remain completely opaque. On top of that, there are no notification or disclosure obligations written into their privacy policy, so whatever is happening today isn’t necessarily what will be true tomorrow.
Even if Oculus isn’t living into the full extent of what their privacy policy affords, it’s written open-ended enough for them to grow into it and create new products that weren’t even imagined or implemented at the time of writing. This gives them flexibility, but it also means that there are many passages in their privacy policy written in such an open-ended and vague way that they could be interpreted to mean a lot of scary things. Hall claimed that the new privacy policy isn’t trying to gain new rights, but the passage “We collect information about the people, content, and experiences you connect to and how you interact with them across our Services” could open the door for Oculus to more precisely track how you interact with specific content within a VR experience.
Both Hall & Cohen emphasized that they’re taking the most conservative interpretations of these types of passages, and that they’re trying to build trust with users, and that their new privacy tools will be providing new levels of transparency and accountability. A lot of these tools seem to be implemented as compelled by the new GDPR laws, and an open question is whether it requires these types of laws encourage Oculus to continue to implement privacy best practices or whether or not they’ll continue to go above and beyond what these policies require and start to provide even more details and information on what exactly is being recorded and tied to identity, what’s being recorded as de-identified information, and what’s stored locally on your computer.
I’m also happy to start a deeper dialogue with people directly on the Privacy XFN team at Facebook/Oculus who are starting to think about these deeper issues around privacy in VR and AR, and some of the privacy challenges that come with biometric data. It’s been difficult to have an embodied conversation with privacy experts at Facebook or Google, and I’m glad that the cultural conversation has shifted to the point where I’m able to have an in-depth conversation about these topics. Hopefully this marks a change in how Oculus engages with the press after not taking any press interviews at either Oculus Connect 4 or GDC 2018.
Coming out of this conversation, I was happy to hear how much consideration is being given to how these data are collected, and I hope that Oculus finds better ways to share this type of information in a more comprehensive and up-to-date fashion. The GDPR catalyzed a lot of great progress here, and I hope that Oculus doesn’t wait for more laws and regulations to keep improving and updating their privacy practices.
Support Voices of VR
- Subscribe on iTunes
- Donate to the Voices of VR Podcast Patreon
Music: Fatality & Summer Trip