Most tools designed to fight misinformation are built with Western audiences in mind. But what happens when you take those same tools somewhere else?
My colleagues and I wanted to find out. We tested whether media literacy games work equally well across cultures by comparing two versions: one originally designed for Western audiences and another built specifically for Indonesian players. We found that whether these games work depends not just on the content, but on culture, design choices, and how people actually engage with them.
Our study was recently published in Nature’s Humanities and Social Sciences Communications, and the full paper is available open access. I also recently presented this work at the Society for Personality and Social Psychology’s Annual Convention. I’m excited to share a summary of our findings here.
Prebunking vs. Debunking
First, let me explain what I mean by prebunking, the strategy behind these games. You’ve probably heard of debunking: correcting false claims after people have already seen them. This can be helpful, but teaching people how to recognize false claims before they encounter them shows even more promise. Prebunking is rooted in William McGuire’s inoculation theory from the 1960s: expose people to a weakened version of misleading arguments so they can build cognitive resistance before encountering the real thing.
It’s a promising idea, and research shows it can be highly effective. But most prebunking research has been done on Western, English-speaking populations. That’s a significant blind spot given how global misinformation has become. So we decided to test two very different games across two very different countries.
Two Games, Two Cultural Logics
Gali Fakta was designed specifically for Indonesia. It simulates a WhatsApp-style group chat where players receive messages from fictional friends and family and have to decide whether content is credible or not. You can play it here. Its tone is prosocial and community-oriented, which reflects how Indonesians actually encounter misinformation: through peer networks and direct messaging apps, not Twitter shouting matches. We previously found that when Indonesian participants played our game, they were better able to detect false information.
Harmony Square was built by and for Western audiences. Players become a “Chief Disinformation Officer” in a fictional town and actively spread misinformation (trolling, fearmongering, stoking polarization) in order to recognize these tactics later. You can play it here. It’s ironic and satirical, very much rooted in the kind of partisan, democratic-discourse-focused media environment familiar to American news consumers. It has been shown to effectively teach Americans how to spot manipulation techniques.
On paper, each game seemed well-matched to its home context. But what would happen when you crossed the wires?
What We Found
We surveyed nearly 1,600 participants across two studies—799 in Indonesia and 790 in the United States—randomly assigning each to play Gali Fakta, Harmony Square, or Tetris (our control condition). In the first study, Indonesian participants were randomly assigned to the original Gali Fakta or an Indonesian adaptation of Harmony Square; in the second, American participants were randomly assigned to the original Harmony Square or an American adaptation of Gali Fakta. Afterward, they evaluated real and false headlines and indicated whether they’d share them. We measured two outcomes: accuracy discernment (could they tell true from false?) and sharing discernment (were they less likely to share false headlines?).
In Indonesia, the culturally tailored game mattered. Gali Fakta improved sharing discernment: people who played it became less likely to share false headlines. Harmony Square had essentially no effect. And Indonesian participants rated Gali Fakta as significantly more engaging than Harmony Square.
The Western satirical game just didn’t land. This makes some sense: in Indonesia, spreading misinformation isn’t just culturally frowned upon, it can carry legal consequences. A game that asks you to roleplay as a disinformation agent probably hits differently there. Additionally, in the U.S., more conservative participants were consistently less accurate at identifying false headlines, but this pattern didn’t appear in Indonesia, where political ideology didn’t predict discernment at all. This makes sense given that Indonesian polarization tends to run along ethnic and religious lines rather than a clear left-right political divide.
In the United States, both games worked well. Harmony Square improved accuracy and sharing discernment, replicating previous findings. But here’s what surprised me: Gali Fakta, our Indonesian WhatsApp game, translated into English also improved both outcomes significantly. Neither game was rated as more engaging than the other.
We thought Harmony Square’s satire and political framing would resonate with American audiences, but were unsure whether Gali Fakta’s WhatsApp design would. As it turned out, Gali Fakta’s simpler, more socially familiar format traveled just fine.
The Engagement Effect
One of the findings I find most compelling: engagement predicted effectiveness. In Indonesia, higher engagement with Gali Fakta strongly predicted better discernment. In the U.S., higher engagement predicted better outcomes for both games.
This suggests that the mechanism behind prebunking games depends on active involvement rather than passive exposure. If a game doesn’t engage you, it probably won’t change how you evaluate information. Cultural fit seems to drive engagement, and engagement drives learning.
What This Means
These results push back against the assumption that effective digital interventions can simply be translated and deployed globally. The satirical, politically dense format that works in the U.S. didn’t transfer to Indonesia. But Gali Fakta’s simpler, peer-based chat format worked in both countries.
Our findings suggest that what travels well across cultures isn’t sophisticated political satire, but familiar communication formats, simplicity, and prosocial values: chat interfaces, helping your community, and recognizable social stakes.
A few important caveats: we used different headline sets in each country, which makes direct comparisons tricky. We only measured effects immediately after gameplay, so we don’t know how long the benefits last. And prebunking games tend to attract people already thinking about misinformation, which raises the broader question of how to reach those who aren’t.
As I regularly discuss here in Misguided, misinformation is a deeply social problem, and it spreads through relationships and trust networks, not just algorithms. These findings suggest that the most effective interventions might be the ones that meet people inside those social dynamics, rather than asking them to step outside of them.
A version of this post also appears on Misguided: The Newsletter.