Spreading disinformation across game ecosystems: from in-game events to metaverses
Video games and virtual worlds are no longer mere entertainment: they have become full-fledged social platforms where hundreds of millions of people, including children and teenagers with low levels of media literacy, interact daily. In this space, disinformation spreads as easily and quickly as entertainment content, and traditional countermeasures are no longer effective against it. This article describes what makes this field distinctive and outlines approaches to verifying fakes in virtual reality.
The gaming audience: not just teenagers, but the whole family
According to a report from the Entertainment Software Association (ESA, 2025), the gaming audience has become extremely diverse in age. The average gamer in the United States is 36 years old with 18 years of «gaming experience». At the same time, 23% of players are under 18, and 28% are over 50. This age diversity complicates the fight against misinformation: young users are more vulnerable to manipulation, while adults are not always aware of the risks of virtual communication.
How does disinformation get into game universes?
Gaming platforms provide unique channels for the dissemination of information, and attackers actively use this:
- In-game events: concerts, tournaments, holidays, and quests attract thousands of participants. If a false narrative is broadcast during such an event, users often perceive it as «official» because of their trust in the brand or the organizer. Attackers can also layer in additional platform features to achieve an even stronger psychological effect.
In 2021, attackers ran a phishing campaign using the brand of the tournament operator WePlay. They created fake websites and sent invitations to Dota 2 players via social networks and messengers in order to obtain their Steam login details under the pretext of tournament registration.
- Specific communication tools: Virtual worlds use spatial audio, avatar gestures, writing in the air, and other forms of interaction, making it difficult to monitor and collect evidence.
For instance, players can whisper to each other, so their speech can only be made out by coming close. They can also use simulated movements of the head, arms, or other body parts, as well as virtual handwriting tools (drawing with crayons in the air).
- Chats: Text and voice messages in games are instantly distributed between participants. Emotional coloring, avatar appearance, sound effects, and even tactile feedback (such as vibration of controllers) enhance the credibility of the message.
One of the most high-profile examples of misinformation in the gaming environment involves Roblox fraud. Scammers used Telegram and in-game chat rooms to lure children with promises of free in-game currency (Robux) in exchange for completing «simple tasks». In fact, they asked the child to use a parent's or grandmother's phone to photograph a screen with an open banking application or a page showing card details. Having gained access to this information, the attackers stole money or took out loans without the account holders' knowledge.
- Online «bridges»: misinformation migrates from games to Discord, Twitch, TikTok, YouTube, and back, amplifying its influence through virality and social network algorithms.
A striking incident occurred in May 2022: former US General Barry McCaffrey posted a video purporting to show a Ukrainian fighter shooting down a Russian plane. He soon deleted the recording, admitting that it was not real footage but a simulation from the Digital Combat Simulator (DCS) flight simulator; such clips are sometimes mistakenly attributed to the game Arma 3 as well. Cases like this show how easily game content can enter circulation as «proof» of real events, especially amid an information panic.
The problems of fact-checking in gaming environments
Verifying information in games faces a number of fundamental difficulties:
- Ephemeral content: chats are often not saved, and messages disappear after the session ends.
- Multimodality: information is transmitted not only by text but also by voice, avatar movements, and visual scenarios; classic fact-checking tools do not work here.
- Limited access to data: many platforms do not provide APIs or chat logs, especially for private rooms.
- Legal and ethical restrictions: working with children and adolescents requires strict confidentiality standards.
- High speed of distribution: false narratives can instantly reach thousands of users, especially when embedded in streams or in-game events.
Strategies to fight disinformation in games
Effectively countering disinformation in gaming universes requires an integrated approach combining technical tools, methodology, and collaboration with platforms. In more detail:
1. Collecting digital evidence. Screenshots, videos, audio files, and metadata (time, server, and nicknames) are the basis of verification.
2. Cross-validation. Comparing in-game events with social media posts and streams helps establish the authenticity of the content.
3. Cooperation with platforms. Working through official requests, reviewing moderation transparency, and participating in pilot projects to combat disinformation.
4. Educational measures. The inclusion of media literacy elements directly into the gameplay.
5. Inoculation through games. Online games such as Bad News, Harmony Square, and Cat Park help users protect themselves from manipulation by immersing them in fake news scenarios. After such a «vaccination», players better understand manipulators' tricks and approach information more critically.
However, such games are not without risks: they can carry a hidden ideological slant or distort the perception of the media space. This highlights the need for localized, culturally tailored solutions.
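The first two steps of the methodology above can be sketched in code. The snippet below is a minimal illustration, not any platform's API: the record fields (`server`, `nicknames`) and the two-minute matching window are assumptions chosen for the example. The idea is that hashing each screenshot or recording at capture time makes later tampering detectable, and that timestamp comparison is the simplest form of cross-validation between an in-game event and a social media post.

```python
import hashlib
import json
from datetime import datetime, timedelta, timezone

def make_evidence_record(path: str, server: str, nicknames: list[str]) -> dict:
    """Build a tamper-evident record for one piece of in-game evidence.

    The SHA-256 digest lets a fact-checker later prove that the
    screenshot or recording has not been altered since capture.
    """
    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    return {
        "file": path,
        "sha256": digest,
        "captured_at": datetime.now(timezone.utc).isoformat(),
        "server": server,        # game server / room identifier (illustrative field)
        "nicknames": nicknames,  # avatars visible in the evidence (illustrative field)
    }

def append_record(log_path: str, record: dict) -> None:
    """Append a record to a JSON-lines log, so the collection order
    itself becomes part of the audit trail."""
    with open(log_path, "a", encoding="utf-8") as log:
        log.write(json.dumps(record, ensure_ascii=False) + "\n")

def within_window(t1: datetime, t2: datetime,
                  tolerance: timedelta = timedelta(minutes=2)) -> bool:
    """Cross-validation helper (step 2): do two timestamps plausibly
    describe the same moment, e.g. an in-game event and a stream clip?
    The two-minute tolerance is an arbitrary example value."""
    return abs(t1 - t2) <= tolerance
```

In practice the tolerance would depend on the platform: stream delays and social media repost lags can stretch to many minutes, so a timestamp match is supporting evidence, never proof by itself.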
Games and virtual universes are not just «another channel» but a distinct information dissemination system that demands appropriate attention, resources, and approaches. Only with these features in mind can an effective system of fact-checking and information resilience be built in this area.