News

Vaccine safety: Russian bots and trolls stoked online debate, research finds

BMJ 2018; 362 doi: https://doi.org/10.1136/bmj.k3739 (Published 30 August 2018) Cite this as: BMJ 2018;362:k3739
Owen Dyer
Montreal

Russian internet bots and trolls—including some of those indicted by special counsel Robert Mueller for interfering in the 2016 US election—have been stoking divisions by creating fake online debate about vaccine safety, say researchers whose study looked at nearly two million Twitter messages posted from 2014 to 2017.

These efforts included a Twitter campaign under the hashtag #VaccinateUS that was entirely created by the St Petersburg based Internet Research Agency, an entity charged in a February indictment by a US grand jury.

Known Russian trolls, identified in lists compiled by the US Congress and NBC News, were over 20 times as likely as average Twitter accounts to tweet about vaccines, said researchers writing in the American Journal of Public Health.1

The lead author, David Broniatowski of George Washington University in Washington, DC, told The BMJ that the researchers, who were working on a National Institutes of Health grant to study vaccine messaging in the US, became intrigued by the discrepancy between polling, which suggests broad public support for vaccination, and the online environment, where scepticism predominates.

They found that “a full 93% of tweets about vaccines are generated by accounts whose provenance can be verified as neither bots nor human users yet who exhibit malicious behaviours.”

The lists of known Russian tweeters accounted for only a tiny fraction of the tweets the team studied. In most cases whether an account is run by a human or a bot cannot be determined with certainty, so researchers estimate this with algorithms that analyse account activity and assign each account a “bot score.”

Besides the known Russians, the most prolific vaccine related tweeters were accounts with intermediate bot scores—a category likely to include the most sophisticated bots, as well as paid human trolls and accounts that mix human and bot activity.
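The article does not reproduce the researchers’ scoring method, but the general idea of an activity based “bot score” can be illustrated with a small, hypothetical Python sketch: a handful of account activity features are combined into a rough score between 0 (human-like) and 1 (bot-like), with intermediate values corresponding to the ambiguous accounts described above. The feature names, weights, and thresholds here are invented for illustration and are not taken from the study.

# Hypothetical illustration of an activity based "bot score".
# Features, weights, and thresholds are invented for this sketch and
# are NOT the scoring method used in the study.
from dataclasses import dataclass

@dataclass
class AccountActivity:
    tweets_per_day: float        # posting volume
    retweet_ratio: float         # share of posts that are retweets (0-1)
    duplicate_ratio: float       # share of near-identical posts (0-1)
    followers_per_friend: float  # follower/following balance

def bot_score(a: AccountActivity) -> float:
    """Combine activity features into a rough 0 (human-like) to 1 (bot-like) score."""
    score = 0.0
    score += 0.35 * min(a.tweets_per_day / 100.0, 1.0)  # very high posting volume
    score += 0.25 * a.retweet_ratio                      # mostly amplification
    score += 0.30 * a.duplicate_ratio                    # repeated content
    score += 0.10 * (1.0 if a.followers_per_friend < 0.1 else 0.0)  # follows far more than followed
    return min(score, 1.0)

if __name__ == "__main__":
    account = AccountActivity(tweets_per_day=120, retweet_ratio=0.8,
                              duplicate_ratio=0.4, followers_per_friend=0.05)
    s = bot_score(account)
    # Intermediate scores (roughly 0.3-0.7) correspond to the hard-to-classify
    # accounts described in the article: sophisticated bots, paid trolls, or mixed use.
    label = "likely bot" if s > 0.7 else "ambiguous" if s > 0.3 else "likely human"
    print(f"bot score = {s:.2f} ({label})")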

Commercial bots known as “content polluters” were also frequent antivaccine tweeters, posting such content at nearly twice the rate of the average account. Rather than advancing a political agenda, the researchers suggest, these bots were using a provocative subject to draw clicks, aiming to attract users to unsolicited advertising or malware.

Known Russian trolls were far more balanced, presenting pro- and antivaccine messages to draw US users into the debate and sow political division, said Broniatowski.

Messages in the #VaccinateUS campaign were often explicitly political, tying vaccination to controversial themes such as religious freedom and immigration. A measles outbreak among Somali immigrants in Minnesota was mentioned several times. And antivaccine messages often mentioned conspiracy theories about the US government that are popular among vaccine sceptics.

“At first our government creates diseases then it creates #vaccines. What’s next?” asked one #VaccinateUS tweet. “Did you know there was a secret government database of #vaccine-damaged children?” asked another.

Some messages seemed designed to foment class division. One suggested: “Apparently only the elite get ‘clean’ #vaccines. And what do we, normal ppl, get?! #VaccinateUS.”

Pro-vaccine messages were sometimes provocative, such as: “I believe in #vaccines, why don’t you? #VaccinateUS.” Another proclaimed, “Your kids are not your property! You have to #vaccinate them.”

Vaccination outside the US was also a theme, said Broniatowski. “There were plenty of tweets about vaccines in Africa and South East Asia, and Italy was one that came up a lot,” he said. Italy’s new populist government removed mandatory vaccination requirements for kindergarten pupils after the Five Star Movement made this a campaign promise in the recent election.2

Broniatowski warned that vaccine advocates who engage antivaccine tweets could be “feeding the trolls” by helping to create an “astroturfed” debate where actually there is broad consensus. “If you see a message that seems designed to provoke a response, giving one could well be playing into their hands,” he said.

The researchers did not look at Facebook, another forum known for widespread antivaccine messaging. Most of the #VaccinateUS tweets have since been deleted as Twitter works to close malicious accounts. The campaign ultimately failed to gain much traction, drawing few responses from outside the Internet Research Agency network.

But the effort probably continues, says Broniatowski, who saw no sign of the activity tailing off during the period studied. “This goes well beyond the 2016 election,” he said. “This campaign began before a single candidate had declared, and I see no reason to believe they’ve stopped now.”

