With the spread of social media, conspiracy narratives have gained a powerful tool. At the same time, efforts to counter such beliefs seemingly miss the mark. That is because we usually reach for the wrong toolbox in our countering efforts, say researchers from the Cambridge Social Decision-Making Laboratory. On the basis of their studies, they have created several online games to “prebunk” rather than debunk misinformation.
“I think you are looking at this the wrong way” – that would probably be the first thing I’d say when crossing paths with a conspiracy narrative. That is, if I were capable of saying the polite thing, something to keep the conversation going, while keeping to myself what I actually believe: that the claim is laughable, absurd and, let’s face it, utterly ridiculous. Not my strongest trait, I fear.
The problem is, others wouldn’t necessarily agree with my perception of their claims. It’s this basic conundrum: people – especially those living outside our social bubble – may well sustain and endorse views and opinions that differ from ours, or even corroborate their reality with “alternative facts”.
So what you and I might find absurd and utterly ridiculous shapes, for them, their understanding of the “real world” and what is “true”.
Technically, we are not unlike each other in that sense. We gain knowledge through evidence, values and so on, and use it to create and substantiate our views. The problem is: both sides think they are right. The others just haven’t seen the light yet. It is a North Pole and South Pole situation. And on both sides of the factual world, it’s quite set in stone what we believe in, who is right and who is wrong.
That makes finding common ground in society an essential challenge.
Especially since truth may indeed be found on one side of the world – at least as long as “right” refers to an understanding based on actual facts and scientific evidence, along with a lot of grey areas and uncertainties. In some cases, the people on the other side have gone down a rabbit hole and lost themselves in the clear-cut world of a conspiracy narrative. Aside from the usual doomsday scenarios they draw on, life is somewhat easier there – everything is pretty much painted in black and white. There is no “you are looking at this the wrong way”, at least not as a matter of self-introspection.
That’s why so-called debunking of conspiracy narratives is hard work, work that tends not to pay off. As researchers from the Cambridge Social Decision-Making Laboratory assert, debunking is especially hard because conspiracy narratives play out at the emotional level of individuals, not the informational one. People who have fallen for a conspiracy narrative are not open to counter-arguments based on facts, as Pia Lamberty has told us.
“We (…) know that viral information tends to stick, that repeated misinformation is more likely to be judged as true, and that people often continue to believe falsehoods even after they have been debunked”, researchers Jon Roozenbeek, Melisa Basol, and Sander van der Linden write in an article for Behavioral Scientist.
Conspiracy narratives, as we have discussed on this blog, are tales as old as time. So it comes as no surprise that science has been pondering them and continues to do so. Why do people fall for such narratives? How do they become drawn in or even radicalized? And most importantly – how can we bring them back?
Here’s the thing, the Cambridge researchers have found: it’s not so much about bringing them back. We should rather invest in prevention and preparedness, to stop people from getting drawn into the lure of a conspiratorial tale in the first place.
Their studies show that “prebunking” might be worth a try. That is, looking into ways to prepare people to recognize fake news, misinformation, and conspiracy narratives online – so that they do not fall prey to such forms of persuasion.
To further that goal, they have applied their research on so-called inoculation theory to a popular medium: online games. In a publicly recognized endeavor (the BBC, the Guardian, Forbes – everyone was reporting on it), the lab left the infamous ivory tower to bring its research to you and me. So far, they have created “GoViral!”, “Bad News” and “Breaking Harmony Square”. Most of them are set in the social media world we are all familiar with. In the games, you are confronted with misinformation on your timeline, or presented with ways to fabricate and spread fake news to build a loyal following.
The rationale behind these games? When you understand the mechanisms at play, you are trained to identify actual fake news or misinformation online – and less likely to fall for a conspiracy narrative. Lead researcher Sander van der Linden calls it a psychological vaccine.
In Behavioral Scientist the researchers go on to explain: “So instead of telling people what to believe, we created these games to equip players with the skills necessary to identify, argue against, and prevent harmful misinformation from going viral.”
Your mind is exposed to the structures and dynamics of conspiracy narratives, so that – to stay with the vaccine metaphor – you do not develop a severe case of belief in such a narrative. You become resilient to the false claims you are exposed to online.
The games have been shown to have a positive effect. In a recent study published in Big Data &amp; Society, authors Melisa Basol, Jon Roozenbeek, Manon Berriche, Fatih Uenal, William P. McClanahan, and Sander van der Linden report that their game GoViral! strengthened people’s perception that (mis)information about COVID-19 may be manipulative, increased their confidence in identifying misinformation, and reduced their inclination to share and spread it.
By Anna Hörter