
Russian trolls may have failed in their mission to divide America, but this is no feel-good story

Researchers looked at Twitter users’ political leanings before and after they interacted with troll accounts.

Russian trolls trying to spur strife on Twitter in 2017 didn’t appear to sway any hearts and minds, because the users interacting with the bogus accounts were already so politically polarized.

‘These findings suggest that Russian trolls might have failed to sow discord because they mostly interacted with those who were already highly polarized.’

Duke University study

The sham accounts associated with the Kremlin-linked Internet Research Agency produced “no substantial effects” on the attitudes of certain Democratic and Republican Twitter TWTR users about their politics or the people on the other side of the aisle, according to a study released Monday by Duke University researchers.

The study showed that interactions such as following a troll account or liking a troll’s tweet didn’t measurably influence outcomes such as someone’s feelings about a close relative marrying a person with opposing politics, or which accounts a user went on to follow.

“These findings suggest that Russian trolls might have failed to sow discord because they mostly interacted with those who were already highly polarized,” researchers wrote in the study, which was based on surveys and Twitter data.

But the findings weren’t any reason to dismiss worries about foreign influence on social media, researchers emphasized.

The sample of approximately 1,200 people was small and nonrepresentative, and only 20% of that pool interacted with the troll accounts. Furthermore, the study couldn’t say whether Russians succeeded in their efforts to inject misinformation and political division ahead of the 2016 election of Donald Trump over Hillary Clinton, which the U.S. intelligence community and the Mueller report have suggested was the Kremlin’s preferred outcome.

It’s also possible that Russian social-media influence and misinformation campaigns have since “evolved to become more impactful,” researchers noted.

At least one expert sounded that warning as an impeachment inquiry into President Trump continues over his alleged pressure campaign to get Ukrainian officials to announce an investigation into the son of Democratic presidential candidate Joe Biden. Trump denies the accusation and has called it a partisan “witch hunt.”

Fiona Hill, a former top aide to then–national security adviser John Bolton, testified at impeachment hearings that American officials were “running out of time” to stop Russian agents and their proxies from meddling in the 2020 election.

Duke political-science professor D. Sunshine Hillygus, one of the study’s authors, emphasized to MarketWatch that “we are not at all suggesting Russia’s attempt to interfere is not problematic. Like most people, we are horrified by it.”

She added, “We are trying to be very careful in what the take-home messages are. What we wanted to emphasize is there’s been an assumption about the impact [of Russian accounts] on the public and counter that.”

The study was a reminder of America’s deeply entrenched political polarization, and the findings were consistent with previous research “showing the difficulty of altering people’s political views,” researchers said. But Hillygus said the research also suggested people couldn’t be easily swayed by the last thing they heard. “We need to give credit to the American public: They are not entirely susceptible to propaganda,” she said.

Online persuasion, or the lack thereof

The study adds another layer of research on political beliefs and persuasion at a time when Americans are deeply divided, immersed in social media and volleying accusations of “fake news.”

The Duke researchers said the people most likely to interact with the troll accounts had already built up the strongest social-media “echo chambers” reinforcing their political beliefs.

That’s in line with other research on “confirmation bias” and the psychology behind “fake news.” Kent State University researchers found that in uncertain times, many people have a powerful urge to surround themselves with reports and opinions that confirm their existing views.

The Duke study participants’ average age was 50, above the national average of approximately 38 according to 2018 census estimates.

Don’t miss: Trump impeachment inquiry and a brutal 2020 election could further stoke ‘toxic’ workplaces, HR experts warn

Previous research from Princeton University found that while 90% of survey participants didn’t share fake articles on Facebook FB during the 2016 presidential campaign, 18% of those who did were identified as Republican and were over 65. (Facebook and Twitter have removed scores of fake Russian-linked accounts.)

The latest study from Duke noted that “Republicans appeared more likely to interact with IRA accounts than Democrats, [but] this association is not statistically significant.”

Duke researchers worked with Twitter’s Elections Integrity Hub to identify users who interacted with Internet Research Agency accounts.

Twitter announced last month it was banning most political advertisements.

On Monday, a Twitter spokeswoman said the site has a massive public archive of foreign information operations that now includes 30 million tweets.

“It is our fundamental belief that this activity should be made public and searchable so members of the public, governments, and researchers can investigate, learn, and build media literacy capacities for the future. We are proud that hundreds of researchers have made use of these datasets to conduct their own investigations and shared their insights and independent analyses with the world,” the Twitter representative said.

