Google is attempting to ‘inoculate’ people against misinformation using pre-emptive ‘pre-bunking’ strategies

Strategy more effective at fighting misinformation than fact-checking each untruth after it spreads, scientists say

Google is attempting to “inoculate” people against harmful content on social media by pre-emptively debunking misinformation or conspiracy theories using “pre-bunking” strategies.

The online experiment led by the University of Cambridge in the UK exposes users to tropes at the root of malicious propaganda via short animations so that they can better identify online falsehoods regardless of the subject matter.

In the study, published in the journal Science Advances on Wednesday, researchers, including those from Google’s Jigsaw unit, exposed people to 90-second clips designed to familiarise users with manipulation techniques.

When these clips are deployed in YouTube’s advert slot, scientists say they can help “inoculate” people against misinformation or conspiracy theories.

Scientists say this strategy is more effective at fighting misinformation than fact-checking each untruth after it spreads, since fact-checking is impossible to do at scale.

Researchers compare the effort to vaccination, adding that it gives people a “micro-dose” of misinformation in advance to prevent them from falling for it in the future – an idea that is based on what social psychologists call the “inoculation theory”.

In seven experiments involving nearly 30,000 participants in total, scientists showed that a single viewing of a film clip increased awareness of misinformation.

Researchers collected basic information from the participants, including gender, age, education, political leanings, levels of numeracy, conspiratorial thinking, news and social media consumption, “bullshit receptivity”, and a personality inventory, among other variables.

The videos, scientists say, show concepts from the “misinformation playbook”, illustrated with relatable examples from film and TV such as Family Guy, or in the case of false dichotomies, Star Wars.

They found that the inoculation videos improved people’s ability to spot misinformation, and boosted their confidence in being able to do so again.

“YouTube has well over 2 billion active users worldwide. Our videos could easily be embedded within the ad space on YouTube to prebunk misinformation,” study co-author Sander van der Linden at Cambridge said in a statement.

“Our research provides the necessary proof of concept that the principle of psychological inoculation can readily be scaled across hundreds of millions of users worldwide,” Dr van der Linden said.

YouTube’s parent company Google is already harnessing the findings.

A “pre-bunking” campaign is expected to be rolled out across several platforms in Poland, Slovakia, and the Czech Republic to get ahead of emerging disinformation relating to Ukrainian refugees, researchers say.

The campaign is aimed at tackling harmful anti-refugee narratives, in partnership with local NGOs, fact-checkers, academics, and disinformation experts.

“Harmful misinformation takes many forms, but the manipulative tactics and narratives are often repeated and can therefore be predicted,” study co-author Beth Goldberg said.

“We’ve shown that video ads as a delivery method of pre-bunking messages can be used to reach millions of people, potentially before harmful narratives take hold,” she added.

While fact-checking only rebuts a fraction of the falsehoods circulating online, researchers say the new technique can teach people to recognise the misinformation playbook and help them understand when they are being misled.

In the study, scientists also tested two of the videos “in the wild” as part of a vast experiment on YouTube, with clips positioned in the pre-video advert slot.

Google showed an inoculation video to around 5.4 million YouTube users in the US, with almost a million watching for at least 30 seconds.

A random 30 per cent of the users who watched the clip were given a voluntary test question within 24 hours of their initial viewing.

A “control” group of users who had not viewed a video was given the same test question.

Researchers found that the videos “improve manipulation technique recognition, boost confidence in spotting these techniques, increase people’s ability to discern trustworthy from untrustworthy content, and improve the quality of their sharing decisions”.

They say the technique could be “game-changing” if scaled up dramatically across social platforms, at an average cost of just $0.05 for each view of significant length.

“If anyone wants to pay for a YouTube campaign that measurably reduces susceptibility to misinformation across millions of users, they can do so, and at a minuscule cost per view,” study co-author Jon Roozenbeek from the University of Cambridge said.