Blog Post

August 10, 2022

Gaming in the metaverse: online safety in another dimension

The metaverse is more than just a tech buzzword: it will likely shape the way we socialise, play, work, and learn. Exactly what that impact is - good or bad - remains undefined. By asking the right questions about safety and risk in the metaverse now, we can begin to work towards answers. In this blog, we look at one of the most developed sectors in the metaverse - gaming - and provide some key considerations and questions for identifying risks in metaverse gaming before they grow in scope and severity.

The metaverse and metaverse gaming

Imagine putting on a VR headset and navigating various worlds through your avatar — an avatar which may look exactly like you. You can game with thousands of others in virtual worlds, buy and sell assets in a functioning digital economy enabled by non-fungible tokens (NFTs), and interact with your friends in a way that closely resembles social interaction in the physical world. 

It sounds like sci-fi, but the metaverse, some argue, is simply the next generation of the internet. Still in its infancy, the metaverse has no universal definition, and there are different visions of what it is or may become. One of the best ways to approach the metaverse and its challenges is through one of its most well-developed sectors - gaming.

Gaming is expected to account for over half of the metaverse’s market value by 2024, and gamers have been early adopters of ‘metaversal’ activities. For example, Fortnite players are already familiar with massively multiplayer experiences as millions of players roam different islands, create their own worlds and compete to be the last player standing. A study by Newzoo showed that 38% of gamers aged 10-20 played proto-metaverse games like Fortnite, Roblox and Minecraft in 2021.

Gamers on both traditional and metaverse platforms also skew disproportionately towards children and young people, a trend that looks set to continue: 56% of gamers aged 13-17 expect their time spent playing metaverse games to increase. As PUBLIC’s previous research highlights, children and young people are already increasingly exposed to illegal or harmful content online (online harms), making gaming the ideal case study for exploring online safety in the metaverse.

Online safety 

While offering exciting opportunities for innovation, the metaverse poses new challenges across a range of issues. Concerns around digital identity, competition, privacy and inclusion are just some of the risks that could emerge and that require attention. However, given PUBLIC’s extensive work to date in online safety, we have focused this blog on safety in the metaverse. We want to help the wider Trust & Safety ecosystem ensure that, collectively, we work towards a dynamic metaverse that is safe for all users. PUBLIC is uniquely positioned to contribute to solutions given our experience in developing taxonomies, understanding harm types, convening relevant stakeholders, and promoting the development of technologies that facilitate safer online experiences and protect users from harmful content, contact or conduct.

As the adoption of metaverse elements in gaming accelerates, online safety is one of the key areas of concern. Are people more vulnerable to mis/disinformation if they are spatially surrounded by it? How do cryptocurrencies and NFTs impact how Child Sexual Abuse Material is bought and stored? How do hyper-realistic recreations of traumatic situations — like combat or horror — impact children? 

With competing conceptions of the metaverse, no one knows exactly how it will develop. Indeed, a preoccupation with defining the metaverse may get in the way of understanding the safety challenges and identifying solutions. Instead, we should accept the uncertainty and focus on the questions that will need answering if, or when, the metaverse becomes the next step in our digital evolution. Importantly, this also offers the opportunity to convene the wider ecosystem of policymakers, technologists and Trust & Safety experts to preempt potential risk areas before they grow in scope.

Key considerations 

It is important for us as an ecosystem to structure our approach to online safety in metaverse gaming. To reach a broad understanding of the risks posed, we need to consider both the scope of people that could be exposed to harm and the severity of that harm. A similar approach could also be used to understand other metaverse sectors, such as dating, or other risks posed by the metaverse such as privacy, competition and inclusion.

Before we try to understand how metaverse gaming can be harmful, we first need to create a shared language and understanding of the sector’s building blocks. This is where a taxonomy fits in. Developing a robust taxonomy allows us to break down the sector systematically and facilitate a more granular and sensitive assessment of risk.

To do this, we first face a definitional dilemma: what exactly is a ‘metaverse game’? Establishing clear definitions is important for any taxonomy, but it is particularly important here, where the boundary between a conventional online game and a metaverse game is hard to draw. Is it the hardware — like VR headsets and haptic technologies — that marks the difference? Is it levels of immersion? Or is it the nature of gaming that changes in the metaverse? For example, whilst in regular gaming you can buy wearables for your avatar, in metaverse gaming there is greater potential to profit: players can sell assets won in the game to other users for cryptocurrency. There may also be a stronger emphasis on socialisation outside of the game itself, such as attending social events in the virtual gaming environment.

It is possible to construct a taxonomy of metaverse games based on these ‘metaversal’ characteristics. We can categorise games on the basis of the presence of different metaverse features. What exactly those features are merits further research, but categories could be based on use of haptics, levels of immersion, or reliance on a digital economy. Another possibility is categorising according to genre. For example, games with non-linear gameplay and a strong emphasis on user-generated content (sandbox games) may be more relevant in the metaverse than games with little involvement from the player, often in a single player mode (idle games).
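As a purely illustrative sketch, a feature-based taxonomy could be expressed as a simple data structure. The feature names, genre labels and scales below are assumptions made for illustration, not an established framework:

```python
from dataclasses import dataclass
from enum import Enum


class Genre(Enum):
    SANDBOX = "sandbox"        # non-linear gameplay, strong user-generated content
    MMORPG = "mmorpg"          # massively multiplayer role-playing
    SHOOTER = "shooter"
    IDLE = "idle"              # little player involvement, often single-player


@dataclass
class GameProfile:
    """Hypothetical record of a game's 'metaversal' characteristics."""
    name: str
    genre: Genre
    uses_haptics: bool          # VR headsets, haptic technologies, etc.
    immersion_level: int        # rough 0-3 scale, flat-screen to full VR
    has_digital_economy: bool   # tradable assets, NFTs, crypto payouts


def metaversal_features(game: GameProfile) -> list[str]:
    """Return the metaverse features present, as a basis for categorisation."""
    features = []
    if game.uses_haptics:
        features.append("haptics")
    if game.immersion_level >= 2:
        features.append("high_immersion")
    if game.has_digital_economy:
        features.append("digital_economy")
    if game.genre in (Genre.SANDBOX, Genre.MMORPG):
        features.append("social_ugc_genre")
    return features


# Example: a hypothetical VR sandbox game with tradable assets sits high on every axis.
example = GameProfile("ExampleWorld", Genre.SANDBOX, True, 3, True)
print(metaversal_features(example))
# ['haptics', 'high_immersion', 'digital_economy', 'social_ugc_genre']
```

A structure along these lines would let researchers group games that share metaversal features and compare risk across categories, whatever the final set of features turns out to be.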

When identifying the greatest levels of risk within metaverse gaming, we need to consider the scope of users. The sheer volume of users playing different types of games — broken down by a shared taxonomy — is one factor that can be taken into account when identifying risk. However, it is not just the number of people using different types of games that is important to understand, but also user demographics. In line with the Online Safety Bill, we should prioritise age and vulnerabilities when considering risk. Disproportionate use of metaverse games by more vulnerable user groups is therefore another factor to bear in mind.

At this point, however, user adoption of metaverse gaming is difficult to gauge. Despite gaming being amongst the most developed metaverse use cases, it still has some way to go before reaching the levels of adoption that gaming platforms hope for. Current usage of metaverse games may therefore look very different five or ten years from now.

Statistics on the usage of traditional gaming could be used as a proxy for how different genres might translate into a metaverse context. For example, shooter games were the most popular genre among 16-24 year old gamers in 2021. Whether this continues to be the case for metaverse gaming remains to be seen. Monitoring of user adoption, behaviour and demographics will be critical over the coming years to ensure that the scope of users can be adequately considered.

Once we have scoped out metaverse maturity against a taxonomy, we should also consider the types of harm and the severity of harms that can emerge in metaverse gaming. To begin this task, we should try to understand how harm might be different in the metaverse compared to traditional online spaces.

Take, for example, abuse, bullying and harassment, which we know are already occurring in the metaverse. Does the ability of perpetrators to physically approach and virtually intimidate someone change the impact on victims and make it more severe? Research suggests that our brains respond to perceived threats in immersive VR environments in a similar way to how they do in real life. It is therefore possible that the severity of some types of harm, which we at PUBLIC have been involved in taxonomising in view of the Online Safety Bill, might be considered differently in the metaverse as opposed to traditional online spaces.

Not only may existing harms be more severe, we should also ask whether entirely new types of harm are possible in metaverse gaming as a result of metaversal features. Does involvement in digital economies and the use of cryptocurrencies carry specific online harms? Do genres like Massively Multiplayer Online Role-Playing Games, where thousands game together at the same time, expose users to new types of risk? Further research is required to better understand how online harms emerge in metaverse gaming and how they can impact users.

Investigating whether it is the characteristics of a metaverse game, its genre, or other factors that enable different online harms is crucial for us to assess risk. By trying to answer some of the questions we have laid out in this blog we can begin to consider how online safety risks in metaverse gaming could be appropriately identified and mitigated.
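To make the scope-and-severity framing concrete, one could imagine a rough prioritisation score that weights a game category's reach by the share of vulnerable users and by the severity of a given harm. The figures, weighting and category names below are illustrative assumptions only, not a proposed methodology:

```python
# Illustrative only: the categories, scales and weighting are assumptions
# made for this sketch, not a proposed risk methodology.

def risk_score(monthly_users: int,
               vulnerable_share: float,
               severity: int) -> float:
    """Combine scope (reach, share of vulnerable users) with severity (1-5)."""
    # Scope: weight raw reach by the proportion of vulnerable users,
    # echoing the Online Safety Bill's emphasis on age and vulnerability.
    scope = monthly_users * (1 + vulnerable_share)
    return scope * severity


# Hypothetical game categories from a taxonomy, scored for one harm type
# (e.g. abuse and harassment in immersive environments).
candidates = {
    "vr_sandbox_with_economy": risk_score(2_000_000, 0.6, 4),
    "flat_screen_idle_game":   risk_score(5_000_000, 0.3, 1),
}
for name, score in sorted(candidates.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {score:,.0f}")
```

Even a crude score like this illustrates the point of the framework: a smaller but highly immersive, economy-driven category can outrank a larger but lower-severity one when deciding where to focus safety efforts first.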

Conclusion

As metaverse gaming grows from today's minority-interest innovation towards potential mainstream adoption, we have the opportunity to ensure it develops as a safe environment. This blog has raised a series of questions and considerations for Trust & Safety stakeholders. Recognising the nascency of metaverse gaming, the Trust & Safety ecosystem should be open-minded about different approaches, and we hope this blog provides some starting points. The considerations we have raised for online safety in metaverse gaming should also be seen as the basis of an approach for other metaverse sectors and risk areas.

Our next blog will look ahead to possible interventions to tackle online harms in metaverse gaming, from upstream “safety by design” approaches to downstream monitoring, moderation and user support. If you’re working on solutions to tackle these challenges, we’d love to hear from you as we prepare our next blog. Get in touch with James at james.katz@public.io and Emelie at emelie@public.io.


James Katz

Manager
