Researchers have found just 12 people are responsible for the bulk of the misleading claims and outright lies about COVID-19 vaccines that proliferate on Facebook, Instagram and Twitter.
“The ‘Disinformation Dozen’ produce 65% of the shares of anti-vaccine misinformation on social media platforms,” said Imran Ahmed, chief executive of the Center for Countering Digital Hate, which identified the accounts.
Now that the vaccine rollout is reaching a critical stage, in which most adults who want the vaccine have gotten it but many others are holding out, these 12 influential social media users stand to have an outsized impact on the outcome.
These figures are well known to both researchers and the social networks. Some of them run multiple accounts across the different platforms. They often promote "natural health," and some even sell supplements and books.
Many of the messages about COVID-19 vaccines now spreading widely online mirror claims that peddlers of health misinformation have made about other vaccines in the past.
“It’s almost like conspiracy theory Mad Libs. They just inserted the new claims,” said John Gregory, deputy health editor at NewsGuard, which rates the credibility of news sites and has done its own tracking of COVID and vaccine misinformation “superspreaders.”
The Disinformation Dozen’s claims range from “denying that COVID exists, claiming that false cures are in fact the way to solve COVID and not vaccination, decrying vaccines and decrying doctors as being in some way venal or motivated by other factors when they recommend vaccines,” Ahmed said.
Many of the 12, he says, have been spreading scientifically disproven medical claims and conspiracies for years.
That raises the question: Why have social media platforms only recently begun cracking down on their falsehoods?
Both members of Congress and state attorneys general have urged Facebook and Twitter to ban the Disinformation Dozen.
“Getting Americans vaccinated is critical to putting this pandemic behind us. Vaccine disinformation spread online has deadly consequences, which is why I have called on social media platforms to take action against the accounts propagating the majority of these lies,” Sen. Amy Klobuchar, D-Minn., told NPR.
Social networks crack down on COVID vaccine claims
The companies have stopped short of taking all 12 figures offline entirely, but they have stepped up their fight: they've labeled misleading posts. They've removed falsehoods. In some cases, they've banned people who repeatedly share debunked claims.
Facebook says it's taken action against some of the figures identified by CCDH, several of whom operate multiple accounts on its apps. That includes permanently removing 11 accounts from Facebook or Instagram and placing restrictions on 19 others, such as preventing them from being recommended to other users, reducing the reach of their posts and blocking them from promoting themselves through paid ads.
“We reacted early and aggressively to the COVID-19 pandemic by working with health experts to update our misinformation policy to target harmful claims about COVID-19 and vaccines, including taking action against some of the accounts in the CCDH report,” said spokesperson Kevin McAlister in a statement. “In total, we’ve removed more than 16 million pieces of content which violate our policies and we continue to work with health experts to regularly update these policies as new facts and trends emerge.”
Twitter says it permanently suspended two of the Disinformation Dozen accounts for repeatedly breaking its rules, required other accounts to delete some tweets, and applied labels that link to credible information about vaccines and don’t allow the tweets to be shared or replied to. Overall, it’s removed more than 22,400 tweets for violating its COVID policies.
However, spokesperson Elizabeth Busby said Twitter distinguishes between “harmful vaccine misinformation that contradicts credible public health information, which is prohibited under our policy, and negative vaccine sentiment that is a matter of opinion.”
And so the Disinformation Dozen are still easy to find on social media.
‘Tried and true’ tactics
Sometimes they skirt the platforms’ rules by using codes.
"Instead of saying 'vaccine,' they may, in a video, hold up the V sign with their fingers and say, 'If you're around someone who has been' — holds up V sign — 'you know, X might happen to you,'" Ahmed said.
Or, they take something true and distort it, like falsely linking a famous person’s death to the fact that they got a vaccine days or weeks earlier.
NewsGuard’s Gregory said a “tried and true” tactic of vaccine opponents is “grossly misrepresenting some sort of research, some sort of data to promote whatever narrative they’ve chosen.”
Facebook says it now limits the reach of posts that could discourage people from getting vaccinated, even if the messages don’t explicitly break its rules.
But the cat and mouse game continues.
Anti-vaccine activists claim censorship
As the social networks have cracked down, some previously prolific spreaders of vaccine misinformation have toned down their posts, and have told their followers they are being censored.
Take anti-vaccine activist Robert F. Kennedy Jr., who has promoted the long discredited idea that vaccines are linked to autism. During the pandemic, he has shared baseless conspiracy theories linking 5G cellular networks to coronavirus, and suggested, without evidence, that the death of baseball great Hank Aaron was “part of a wave of suspicious deaths” tied to vaccines.
None of that is true.
Kennedy was kicked off Instagram, which Facebook owns, in February for repeatedly sharing debunked claims.
Yet Facebook did not remove him from its namesake platform. He told NPR the company has flagged some of his posts, however, so he has become more cautious.
“I have to post, like, unicorns and kitty cat pictures on there,” he said. “I don’t want to give them an excuse.”
He also uses Facebook to promote his website and newsletter, where he makes claims he cannot make on the social network.
Kennedy said he’s never posted misinformation and accused Facebook of censorship. He says the crackdown has cost “hundreds of thousands of dollars” in donations to his organization.
A battle of persuasion
Even as the social media companies have gotten tougher recently on misinformation, researchers worry the persistence of vaccine-related hoaxes will further erode confidence among people who hesitate to get the shot.
That’s especially concerning as vaccines roll out for kids 12 and up.
In a survey of U.S. parents, Indiana University sociologist Jessica Calarco found more than a quarter don't plan to vaccinate their kids.
“So many of these moms are turning to Facebook, are turning to Twitter, are turning to other social media platforms” for news and information, she said. “And they’re saying, ‘Every time I open my phone, I see something different.'”
Even some parents whose kids have had routine childhood vaccines told Calarco they're unsure about the COVID-19 shots.
Facebook this week released survey data showing vaccine acceptance among adults in the U.S. has increased by 10 percent since January. However, its survey also shows that the top reasons people say they don’t want to get vaccinated are worries about side effects and lack of trust in the vaccines or the government — exactly the kind of fears anti-vaccination accounts promote.
The social networks say amplifying credible information from authoritative sources, such as the Centers for Disease Control and Prevention, is just as important as reducing the spread of harmful posts. Both Facebook and Twitter link to public health information in their apps and in the labels they put on misleading posts.
But they now face an uphill battle to persuade the skeptics.
Calarco says many of the parents she spoke with weigh the posts they see on social media “equally against the kinds of expert medical recommendations, expert medical information coming out of things like the CDC.”
Editor’s note: Facebook is among NPR’s financial supporters.