Can We Do Anything About Sexual Crimes in the Metaverse?

Junmo Yoon

Associate Editor

Loyola University Chicago School of Law, JD 2024

* Content warning: This piece contains references to sexual assault and harassment.

In an article titled "Reality or Fiction," Nina Jane Patel shared her experience of sexual harassment in the Metaverse. She repeatedly asked fellow users to stop and tried to move away, but they followed her, continuing their verbal assault and sexual advances. In part, she writes, "they touched and groped while they took selfies. They were laughing, they were aggressive, and relentless. I froze. It was a nightmare." As she tried to escape the situation, she could still hear them – "don't pretend you didn't love it, this is why you came here." After the assault, she could not report it to the police, and no suit was filed against the group of four men.

This is because the assault happened in the Metaverse. All Ms. Patel had to do was take off her Oculus headset. It was also all she could do.

Not just another game

In simple terms, the Metaverse, launched in October of 2021, adds a third dimension to the online space. Using augmented reality (AR) or virtual reality (VR) gear, users are connected and immersed in the world inside the screen, interacting with a massive virtual network as they would with the real world. The Metaverse's reach is vast, and much of it is, by definition, still being defined. While the evolution continues, a few fundamental features stand out: the AR and VR technology that simulates the real world, and the digital avatars that represent human users.

However, the Metaverse is not just another game. Its creators unquestionably intended it to mimic and mirror the real, physical world. Users can hang out with friends, attend meetings and conferences, go to live music events, and even buy land and NFT artworks without leaving their bedrooms.

It is common to stumble upon video clips of people wearing AR/VR gear screaming in an empty room or tripping over thin air. These users obviously know that what they are seeing is a simulated universe with fake zombies and fake cliffs. Yet even at this early stage, the immersive features and haptic technologies of AR/VR devices can trick the brain and the senses into believing the experience is very much real.

Boundaries blurred

Nina Jane Patel, as part of her research, had created a profile in the Metaverse in December of 2021. Within 60 seconds of joining, she was verbally and sexually harassed by three to four men. When her female avatar was touched in the game, she felt her handheld controllers vibrate. The panoramic view, audio, and touch simulation provided by VR headsets and handheld controllers create a multisensory experience, blurring the separation between the virtual and the physical. The immersive and haptic features meant to enhance the user experience made the assault all the more daunting and traumatizing for Ms. Patel.

It does not end when users take off the headset. Victims undergo the same anxiety, panic attacks, and possibly depression that victims of in-person assault experience. Since revealing her virtual sexual assault, Ms. Patel has received death and rape threats, this time offline, as well as threats against her daughters, illustrating the real-world impact online harassment can have.

The boundary between the virtual and the real world is increasingly blurred. The forms of violence and crime are neither distinct nor separate but interconnected: abuse in the Metaverse influences other users and begets more abuse. We must recognize the continuous nature of violence in the virtual space, because violence traverses spaces. Both platform owners and legislatures need to protect victims of violence through stricter guidelines and real regulations.

Meta is technically doing nothing

At a shareholder meeting this May, investors and advocacy groups proposed that Meta publish a report examining the potential civil and human rights harms users could face in the Metaverse; the proposal was voted down. Nick Clegg, President of Global Affairs at Meta Platforms, added, "people shout and swear and do all kinds of unpleasant things that aren't prohibited by law… Metaverse will be no different. People…will always find ways around it."

After numerous sexual harassment incidents, Meta added a 'personal boundary' option that creates an invisible bubble around each avatar. However, this option is only a temporary bandage on a bullet wound when Meta's decision-makers evidently refuse to recognize the issue and trivialize the trauma users suffer from their Metaverse experiences.

Lack of regulations and laws against crimes

Regulating social media and communication over the internet is not a new concept. Cyberbullying and cyber assault have been around for many years, growing alongside social media. However, while virtual worlds are being built to mirror the physical world, the cyber laws that hold perpetrators accountable have lagged behind.

Recent laws enacted to regulate social media may also apply to the Metaverse. The European Union's newly rolled-out Digital Services Act fines a company up to 6% of its global revenue (about $7 billion for Meta) if it fails to monitor and remove hateful or unsafe content. A similar regulation could help develop new safety guidelines for the Metaverse. Moreover, the UK's data watchdog, the Information Commissioner's Office, sought talks with Facebook earlier this year over the lack of parental controls on its Oculus headsets.

In contrast, US tech companies face no similar accountability. They are practically immune from legal liability because the laws simply do not exist. We are still very early in the Meta-world era; lawmakers should learn about the new industry and proactively pass laws that protect users from injury in the rapidly growing virtual world.

It is naive to think that the darkest parts of the physical world will not be mirrored in the virtual one. The impacts are bigger than a breach of a company's terms of service, and the consequences should be bigger than a ban from the platform. Platforms and legislatures must share the legal burden when things go wrong. If avatars represent real-world people, real-world safety measures should be implemented to protect every user.