As the US Presidential election draws closer, incendiary memes on social media are emerging as a means of driving a further wedge between left- and right-wing voters.
The online disinformation campaign is so well disguised that the US Senate Intelligence Committee is worried Facebook users might unwittingly share foreign propaganda.
A new study by the defence think tank RAND Corporation, sponsored by the California Governor's Office of Emergency Services, has found a way to possibly mitigate that threat going forward.
The randomised controlled trial included more than 1,500 Facebook users from both sides of the political aisle. While not published in a peer-reviewed journal, it is one of the first studies to test audience reactions to actual foreign propaganda.
In the trial, participants were shown and asked to rate real news headlines, false news headlines, and Russian propaganda memes, both when the source was a mystery and when it was labelled. They were then asked whether they would like or share the post.
The bad news is that these memes often hit their target, provoking a strong emotional and partisan response, especially among hard-left and hard-right American voters, who tended to rely on information from The New York Times or Fox News, respectively.
But there's good news, too. When participants were shown social media content with labels revealing Russia as the source, they were less likely to 'like' or share the post.
"It is difficult to assess the degree to which revealing the source is a feasible intervention," the authors admit, but that said, their findings suggest there might be "immense value in developing a third-party plug-in that can unmask the source of state-sponsored content."
In the meantime, providing a more generalised warning about Russian propaganda might be an easier and less costly step to mitigate its spread.
When far-right participants were shown a brief video on how to assess the veracity of information online, they were less likely to hit 'like' on the memes.
This suggests media literacy might be a way to inoculate some Facebook users against disinformation, although it didn't work for all participants: the media literacy video had no such effect on left-leaning users, for instance.
Even so, researchers at RAND think there might be some utility in warning people to be highly suspicious of online memes, their sources, and their intent.
"It also might be possible to inoculate audiences against Russian propaganda by pairing the warning with a weakened example of a Russian propaganda meme and providing directions on how to refute the meme," the authors suggest.
This idea needs to be tested further, but it lines up with what Cambridge University researchers are finding with an online 'Bad News' game that teaches people how to think like a propagandist. After playing, people are on average 21 percent better at determining the reliability of news.
Of course, this problem can be avoided entirely if social media firms quickly detect and remove propaganda and fake news. But right now, that's something they are failing to do.
Fake news about COVID-19 is rampant right now and the vast majority of these misleading posts have escaped regulation and lack appropriate warnings.
The current study is only an exploration of possible solutions, but it suggests that teaching far-left and far-right users to recognise propaganda might keep them from spreading it further.
"Left- and right-wing audiences are particular targets of Russian propaganda efforts, so they naturally have a big reaction because the propaganda speaks to them," says behavioural scientist Todd Helmus who works at RAND, the non profit policy think tank that conducted the research.
"A big reaction also means a lot of room for improvement in terms of learning how to think critically about the source of a particular news item, and the consequences of passing it along to others."
The report was published by the RAND Corporation.