Metaverse risks and harms among US youth: Experiences, gender differences, and prevention and response measures


We have a brand new piece just published in New Media & Society! We are excited about this contribution to the literature, which draws on a nationally representative sample of 5,005 youth in the United States. Most importantly, we’ve distilled the findings into clear recommendations for platforms, parents and guardians, teen users, content creators, and schools. I’ve summarized the main takeaways below, and would love to discuss your thoughts and ideas for future research. The paper is available here; feel free to email us if you can’t access the journal article and we’ll send it your way.

To begin, it’s important to state that use of virtual reality (VR), the foundation for participation in metaverse spaces, is increasing among younger populations. In March 2024, Piper Sandler researchers found that 33% of teens own a VR device while 13% use it weekly. This can be compared to numbers from six months prior (September 2023), when ownership was 31% and weekly use was 10%. This slow but steady rise in metaverse and VR engagement among youth underscores the importance of investigating the potential risks and harms associated with these immersive environments, especially considering that the visceral nature of experiences in these spaces – which often involve personally curated avatars used as extensions of oneself – may amplify the impact of negative encounters.


Our study aimed to address four key research questions:

1) What is the extent of VR device use and participation in social metaverse environments among teens?
2) What types of harm are typically experienced by this population in these environments?
3) How are youth protecting themselves from victimization in the metaverse?
4) Do boys and girls have different experiences in the metaverse?

As mentioned, data were collected from a nationally representative sample of 5,005 English- or Spanish-speaking middle and high school students between the ages of 13 and 17 in the United States. The average age of the sample was 15.1, with 52% identifying as girls and 47% as boys. The survey focused on students’ experiences with VR devices and metaverse environments, including the types of harm experienced, protective strategies employed, and potential differences in experiences between boys and girls.

The study findings align with observations from the incipient literature base about the potential for harm given the immersive nature of metaverse environments, as well as the personal connection users often feel with their avatars. A significant percentage of youth reported experiencing various forms of harm in these spaces, including hate speech, bullying, harassment, sexual harassment, grooming behaviors, and unwanted exposure to violent or sexual content. Furthermore, the study revealed notable gender differences in experiences. Girls, for instance, were significantly more likely to be subjected to sexual harassment and grooming behaviors and were more frequently targeted based on their gender.

Based on these findings, here are some of our recommendations for different stakeholder groups to enhance safety in metaverse environments (check out the paper for more details!).


Metaverse Platforms
We appreciate the development and implementation of safety features like “Space Bubble,” “Personal Boundary,” and “Safe Zone” so that users have more agency in restricting unwanted interactions and infringements upon their personal space. We’d like to see continued R&D in these areas to determine how to meet the needs of users in potential or actual victimization contexts. Additionally, platforms should streamline reporting mechanisms and ensure swift action is taken against perpetrators to build trust and confidence in their ability to address user safety concerns. Age-gating mechanisms continue to be refined to protect younger users from mature content and interactions, and we expect new developments in this area soon. Platforms should establish and enforce comprehensive community guidelines that explicitly address harassment and abuse, providing clear examples and explanations to remove grey areas that might otherwise be crossed unwittingly or exploited intentionally. Finally, building improved content moderation systems that use AI and machine learning to detect, categorize, label, and separate harmful from harmless content can go a long way toward proactively managing these threat vectors.
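To make the content moderation point a bit more concrete, here is a minimal sketch (in Python, using scikit-learn) of the kind of classifier that could triage text chat into harmless and harmful buckets. The toy training examples, the triage helper, and the review threshold are our own illustrative assumptions, not the pipeline of any actual platform; a production system would need large, audited datasets, multimodal signals (voice, gestures, spatial behavior), and human moderators in the loop.

```python
# Minimal sketch of ML-assisted text moderation for a social VR chat feed.
# The tiny labeled dataset and the 0.8 review threshold are hypothetical
# placeholders for illustration only.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labeled examples: 1 = harmful, 0 = harmless.
messages = [
    "you are worthless, nobody wants your kind here",   # harmful
    "keep following her around until she logs off",     # harmful
    "great build, want to team up next round?",         # harmless
    "thanks for showing me that portal",                # harmless
]
labels = [1, 1, 0, 0]

# TF-IDF features feeding a logistic regression classifier.
moderation_model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    LogisticRegression(),
)
moderation_model.fit(messages, labels)

def triage(message: str, review_threshold: float = 0.8) -> str:
    """Label an incoming message so harmful content can be separated out."""
    p_harmful = moderation_model.predict_proba([message])[0][1]
    if p_harmful >= review_threshold:
        return "hold_for_moderator"   # likely harmful: escalate to a human
    if p_harmful >= 0.5:
        return "flag_and_monitor"     # uncertain: deliver but keep watching
    return "allow"                    # likely harmless

print(triage("nobody wants you here"))
```

The design choice worth noting is the tiered output: rather than a binary allow/block decision, borderline content is flagged for monitoring or human review, which helps balance proactive protection against over-removal.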

As another key recommendation, platforms can help with education and awareness by developing user-friendly guides, short-form videos, and other interactive resources that help youth-serving adults learn about metaverse safety features, parental controls, and strategies for engaging in productive conversations with teens about online safety. These resources can be disseminated through various channels, including safety centers on their corporate websites, inbox messages, social media campaigns, and partnerships with schools and community organizations. Of course, co-designing and co-creating these initiatives with young people themselves is essential so that the outputs are engaging, relevant, and truly effective in accomplishing their intent.

Parents and Guardians
We encourage parents and guardians to adopt an active mediation approach by engaging in open and supportive dialogue with youth about their metaverse experiences, cultivating digital literacy, and collaboratively establishing responsible tech use guidelines. It’s also worth taking the time to become familiar with the parental control features available on VR devices and metaverse platforms in order to set boundaries, monitor activities, and restrict certain features as needed. Screen-sharing or mirroring a teen’s VR experience can also be an effective deterrent to inappropriate behavior and provide opportunities for real-time guidance and intervention.

Users
It is essential that youth understand and take advantage of the safety features available to them within metaverse experiences, including blocking, muting, and reporting functionalities, and that they learn how to disengage from unsafe interactions effectively. Depending on the platform, exploring third-party blocklists can also help users proactively avoid encounters with individuals known for perpetrating harassment. Girls in particular may find it beneficial to use these tools to protect themselves from misogynistic behavior and online harassment.
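For readers unfamiliar with the idea, here is a very small conceptual sketch of how a shared (third-party) blocklist could be applied before another user is allowed to interact. The example user IDs and function names are hypothetical and do not reflect any specific platform’s API.

```python
# Conceptual sketch: checking a community-maintained blocklist, merged with a
# teen's personal block list, before an incoming message is shown.
# All IDs and names below are hypothetical placeholders.

shared_blocklist = {"user_8841", "user_1290", "user_7713"}  # community-maintained
personal_blocklist = {"user_5566"}                          # blocked by this teen

def can_interact(sender_id: str) -> bool:
    """Return False if the sender appears on either blocklist."""
    return sender_id not in (shared_blocklist | personal_blocklist)

def handle_incoming(sender_id: str, message: str) -> None:
    if can_interact(sender_id):
        print(f"{sender_id}: {message}")
    else:
        # Blocked senders are silently muted; no notification is shown to them.
        print(f"[muted message from blocked user {sender_id}]")

handle_incoming("user_8841", "hey, come over here")   # muted
handle_incoming("user_2001", "nice avatar!")          # delivered
```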

Content Creators
The onus is on content creators to consider the ethical implications of their metaverse creations, ensuring that they promote inclusivity and respect and discourage any form of harassment. Integrating design elements that nudge users toward positive social interactions, such as prompting respectful communication or rewarding prosocial behaviors, can contribute to a safer online environment. Content creators should also prioritize inclusivity and strive to make their virtual experiences accessible to users from diverse backgrounds, languages, cultures, and abilities. As another research-based strategy, implementing clear codes of conduct and facilitating a shared understanding of acceptable behaviors within specific virtual spaces can help foster a sense of responsibility and accountability among users.

Schools
We continue to champion the need to integrate updated, relevant, and accessible digital citizenship and media literacy modules into school curricula to provide youth with the necessary knowledge and skills to navigate VR and other emerging technologies safely and responsibly.

Here’s the bottom line. Our research found that a substantial number of youth in the United States engage with social VR spaces and other metaverse environments, but experience certain risks and harms to a nontrivial degree. There remains so much promise with these new technologies, but vigilance is required when it comes to the unique challenges they present, as well as the unique vulnerabilities that certain youth users may have. As such, it’s “all hands on deck” to build a safer and more inclusive metaverse as it continues to evolve. We hope that these empirical findings spotlight the clear potential for risk and harm, and also motivate stakeholders to redouble their efforts to safeguard, support, and empower youth in these settings.

Image sources: Jessica Lewis 🦋 thepaintedsquare, Freepik
