Privacy in the Metaverse is Impossible?

New research suggests that immersive experiences will also be very privacy-unfriendly...


A new paper from the University of California, Berkeley suggests that privacy may be impossible in the metaverse.

The study was led by graduate researcher Vivek Nair at the Center for Responsible Decentralized Intelligence (RDI). It involved the largest dataset of user interactions in virtual reality (VR) that has ever been analyzed for privacy risks.

The surprising revelation is how little data is needed to uniquely identify a user in the metaverse, potentially eliminating any chance of true anonymity in virtual worlds.

Simple motion data may be enough

Typically, researchers and policymakers who study metaverse privacy focus on the many cameras and microphones in VR headsets that capture detailed information about the user’s facial features, vocal qualities, and eye motions — as well as the ambient information about the user’s surroundings. Moreover, novel technologies like EEG sensors are also able to detect unique brain activity through the scalp.

Now, you may think that turning off some of these sensors will do the trick, but you would be wrong. That’s because the most basic data stream needed to interact with a virtual world — simple motion (telemetry) data — may be all that’s required to uniquely identify a user within a large population.

Identification in seconds

The Berkeley study, entitled “Unique Identification of 50,000-plus Virtual Reality Users from Head and Hand Motion Data,” analyzed more than 2.5 million fully anonymized VR motion recordings from more than 50,000 players of the popular Beat Saber app. It found that individual users could be uniquely identified with more than 94% accuracy using only 100 seconds of motion data. What’s more, half of all users could be uniquely identified from just 2 seconds of motion data.

To put it differently, any time a user puts on a mixed reality headset, grabs the two standard hand controllers, and begins interacting in a virtual world, they leave behind a trail of digital fingerprints that can uniquely identify them. Heck, the study suggests that when a VR user swings a virtual saber at an object flying towards them, the motion data they leave behind may be more uniquely identifiable than their actual real-world fingerprint. Plus, the same data could be used to accurately infer a number of specific personal characteristics about users, including their height, handedness, and gender.

That is a scary thought when you think about it.
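To make the identification idea concrete, here is a minimal sketch of how per-user “motion prints” could be matched with a simple nearest-centroid approach. To be clear, this is not the Berkeley team’s actual method; the function names, the feature choices, and the assumed telemetry shape (one row per frame, one column per tracked coordinate of the headset and two controllers) are all illustrative assumptions.

```python
import math

def motion_features(frames):
    """Summarize a telemetry stream (list of per-frame coordinate lists,
    e.g. head + two hands) into per-channel mean, spread, and mean speed."""
    n, dims = len(frames), len(frames[0])
    means = [sum(f[d] for f in frames) / n for d in range(dims)]
    spreads = [math.sqrt(sum((f[d] - means[d]) ** 2 for f in frames) / n)
               for d in range(dims)]
    speeds = [sum(abs(frames[t][d] - frames[t - 1][d]) for t in range(1, n))
              / (n - 1) for d in range(dims)]
    return means + spreads + speeds

def enroll(sessions_by_user):
    """Average each user's session features into a centroid 'motion print'."""
    centroids = {}
    for user, sessions in sessions_by_user.items():
        feats = [motion_features(s) for s in sessions]
        centroids[user] = [sum(col) / len(col) for col in zip(*feats)]
    return centroids

def identify(centroids, frames):
    """Return the enrolled user whose motion print is nearest to this stream."""
    f = motion_features(frames)
    return min(centroids, key=lambda u: math.dist(centroids[u], f))
```

The unsettling point the study makes is that even crude statistics like these can separate users: no cameras, microphones, or biometrics are needed, only the telemetry every VR app already receives.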

Eliminating anonymity

The troubling implication of these findings is that motion data, while fundamental to how the metaverse works, poses a serious privacy risk and could eliminate anonymity in virtual worlds altogether.

In that sense, Nair describes moving around in a virtual world while streaming basic motion data as “like browsing the internet while sharing your fingerprints with every website you visit.” He added that “unlike web-browsing, which does not require anyone to share their fingerprints, the streaming of motion data is a fundamental part of how the metaverse currently works.”

The Berkeley study suggests that common motions in the metaverse could be as unique to each of us as fingerprints. If that’s the case, these so-called “motion prints” would mean that casual shoppers wouldn’t be able to visit a virtual store without being uniquely identifiable.

Can this privacy problem be solved?

One approach could be to obscure the motion data before streaming it from the user’s hardware to any external servers. However, this means introducing noise, which could lead to a slower (and worse) experience for users. For most users, the tradeoff may not be worth it.
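As a toy illustration of that tradeoff (and not the defense the Berkeley researchers are actually building), one could perturb every coordinate with zero-mean Laplace noise before it leaves the device. The `scale` parameter below is a hypothetical knob: larger values hide more of the user’s motion signature but also degrade tracking fidelity.

```python
import random

def laplace_noise(scale):
    """Sample zero-mean Laplace noise as the difference of two
    exponential draws (a standard construction)."""
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def obscure(frames, scale=0.05):
    """Perturb every coordinate of a telemetry stream before it is
    streamed off-device. Larger `scale` means more privacy, less fidelity."""
    return [[x + laplace_noise(scale) for x in frame] for frame in frames]
```

The noise averages out to zero over time, so gross gameplay still works, but per-frame precision suffers; that loss of responsiveness is exactly why this naive version may not be acceptable to most users.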

Another approach is to enact sensible regulations preventing metaverse platforms from storing and analyzing human motion data. The difficulty lies in enforcing such rules, and the industry would (obviously) push back against them. Moreover, any regulation would have to be introduced country by country, leaving platform operators free to exploit legal loopholes wherever they exist.

Researchers at Berkeley are exploring sophisticated defensive techniques that they hope will obscure the unique characteristics of physical motions without degrading dexterity in virtual and augmented worlds.

We can only hope they manage to come up with technology that works. In the meantime, we look forward to seeing regulation introduced in at least some parts of the world. In the EU, for instance, the GDPR could be extended to cover metaverse-specific scenarios, and similar protections could build on existing privacy laws in California and elsewhere.

We will keep following the “privacy in the metaverse” story and report back as soon as we have something to add. Stay tuned in the meantime…