Facebook versus the BMJ: when fact checking goes wrong
BMJ 2022; 376 doi: https://doi.org/10.1136/bmj.o95 (Published 19 January 2022) Cite this as: BMJ 2022;376:o95
- Rebecca Coombes, head of journalism
- Madlen Davies, investigations editor
- Correspondence to R Coombes rcoombes{at}bmj.com
On 3 November Howard Kaplan, a retired dentist from Israel, posted a link to a BMJ investigation article in a private Facebook group.1 The investigation reported poor clinical trial practices at Ventavia, a contract research company helping to carry out the main Pfizer covid-19 vaccine trial.2
The article brought in record traffic to bmj.com and was widely shared on Twitter, helping it achieve the second highest “Altmetric” score of all time across all biomedical publications.3 But a week after posting it Kaplan woke up to a message from Facebook (fig 1).
“The Facebook Thought Police has issued me a dire warning,” he wrote in a new post. “Facebook’s ‘independent fact-checker’ doesn’t like the wording of the article by the BMJ. And if I don’t delete my post, they are threatening to make my posts less visible. Obviously, I will not delete my post . . . If it seems like I’ve disappeared for a while, you’ll know why.”4
Kaplan was not the only Facebook user having problems. Soon several BMJ readers were alerting the journal to Facebook’s censorship. Over the past two months the journal’s editorial staff have navigated the opaque appeals process without success, and the investigation remains obscured on Facebook to this day.
The experience has highlighted serious concerns about the “fact checking” being undertaken by third party providers on behalf of Facebook, specifically the lack of accountability and oversight of their actions, and the resulting censorship of information.
“Missing context”
From 10 November, The BMJ’s readers began reporting a variety of problems when trying to share its investigation on Facebook. Some reported being unable to share it at all. Many others reported having their post flagged with a warning about “Missing context . . . Independent fact-checkers say this information could mislead people.” Facebook told posters that people who repeatedly shared “false information” might have their posts moved lower in its news feed. In one private Facebook group for people who had experienced long term neurological adverse events after vaccination, administrators received a message from Facebook informing them that a post linking to The BMJ’s investigation was “partly false” (fig 1).
Readers were directed to a “fact check” performed by Lead Stories,5 one of the 10 companies contracted by Facebook in the US,6 whose tagline is “debunking fake news as it happens.” An analysis last year showed that Lead Stories was responsible for half of all Facebook fact checks.7
The Lead Stories article said that none of the flaws identified by The BMJ’s whistleblower, Brook Jackson, would “disqualify” the data collected from the main Pfizer vaccine trial. Quoting a Pfizer spokesperson, it said that the drug company had reviewed Jackson’s concerns and taken “actions to correct and remediate” where necessary, and that Pfizer’s investigation “did not identify any issues or concerns that would invalidate the data or jeopardize the integrity of the study.” Lead Stories also said that Jackson did not “express unreserved support for covid vaccines” and had worked at the trial site for only two weeks.
No errors found
The Lead Stories article, though it failed to identify any errors in The BMJ’s investigation, nevertheless carried the title, “Fact Check: The British Medical Journal Did NOT Reveal Disqualifying and Ignored Reports of Flaws in Pfizer COVID-19 Vaccine Trials.”
The first paragraph wrongly described The BMJ as a “news blog,” and the article was accompanied by a screenshot of the investigation with a stamp over it stating “Flaws Reviewed,” despite Lead Stories identifying nothing false or inaccurate. Lead Stories did not mention that the investigation was externally peer reviewed, although this was stated in the article, and it published its fact check under a URL containing the phrase “hoax-alert.”5
The BMJ contacted Lead Stories, asking it to remove its article. It declined. The author of the article, Dean Miller, replied to say that Lead Stories was not responsible for Facebook’s actions.
“In the Facebook system, we flagged the article ‘Missing Context,’ which is the lowest possible flagging category,” says Miller. “It’s my understanding Facebook Enforcement doesn’t throttle back distribution or traffic based on a ‘Missing Context’ rating. I may be wrong, but I believe the result is merely a flag on the content.”
Miller defended his article, noting, “We did not call into question the integrity of The BMJ’s story, only the comprehensiveness of it. That’s the point of a ‘Missing Context’ rating.
“We couldn’t agree more with you the public should be concerned, provided they have all the context, which is what we attempted to point out and, in some small way, provide as a supplement to The BMJ’s report.”
The BMJ based its story on dozens of original documents provided by the experienced clinical trial auditor turned whistleblower Jackson and was confident in the authenticity of her evidence. After publication, and as reported in a linked rapid response on bmj.com, The BMJ contacted Ventavia, Pfizer, and the US Food and Drug Administration (FDA) to better clarify the scope and implications of the problems identified at Ventavia and what corrective measures had been taken.8 At the time of going to press Ventavia had not responded to The BMJ’s repeated requests for information.
Pfizer told The BMJ that it had investigated an anonymous complaint about Ventavia in September 2020 and that “actions were taken to correct and remediate where necessary.” The FDA stated that it was unable to answer The BMJ’s questions, “as it is an ongoing matter.”
In a subsequent email, Alan Duke, editor in chief of Lead Stories, told The BMJ that the “Missing Context” label was created by Facebook specifically “to deal with content that could mislead without additional context but which was otherwise true or real.” He added that the article was being widely shared and commented on by antivaccine activists on Facebook. “We agree that sometimes Facebook’s messaging about the fact checking labels can sound overly aggressive and scary. If you have an issue with their messaging you should indeed take it up with them as we are unable to change any of it.”
The BMJ also contacted the International Fact-Checking Network (IFCN), run by the Poynter Institute for Media Studies, a non-profit journalism school in St Petersburg, Florida, whose donors include Facebook and Google.9 IFCN sets quality standards for fact checking organisations and creates a verified list of companies that meet these standards, including Lead Stories. Poynter referred The BMJ back to Facebook.
Gary Schwitzer, adjunct associate professor at the University of Minnesota’s School of Public Health and publisher of HealthNewsReview, which grades US news organisations’ health reporting, said there was an “inherent conflict of interest” in Facebook’s use of third party organisations to fact check content. “So a company facing a credibility crisis hires you to help them out,” he told The BMJ. “There is an inherent pressure on the contractor, then, if they want to be paid, to come up with problems and to appear to help solve them.”
He said that the processes by which Facebook decided which content to send for fact checking, and the contractors’ systems for deciding which pieces they reviewed, were neither transparent nor consistent enough. A supposedly objective “fact check” was in reality “subject to individual reviewer opinion,” he added. Fact checkers often miss genuinely misleading stories, such as articles reporting relative rather than absolute risk, said Schwitzer.
Wider problem
Cochrane, the international provider of high quality systematic reviews of medical evidence, has had a similar experience with Instagram, which, like Facebook, is owned by Meta.
A Cochrane spokesperson said that in October its Instagram account was “shadowbanned” for two weeks, meaning that “when other users tried to tag Cochrane, a message popped up saying @cochraneorg had posted material that goes against ‘false content’ guidelines” (fig 1). Shadowbanning can cause posts, comments, or other activity to be hidden or obscured and to stop appearing in searches.
After Cochrane posted on Instagram and Twitter about the ban, its usual service was eventually restored, although it has not received an explanation for why it fell foul of the guidelines in the first place.
The spokesperson said, “We think Cochrane was reported as it had published a review on ivermectin and was ironically supporting a campaign about spreading misinformation. It seems sometimes automation and artificial intelligence get it wrong, and user reporting mechanisms can be used to block the wrong people.”
In December The BMJ wrote an open letter to Mark Zuckerberg, Meta’s chief executive.10 In it, editors Fiona Godlee and Kamran Abbasi called Lead Stories’ fact checking “inaccurate, incompetent, and irresponsible.” The letter asked Meta to review the warning placed on The BMJ’s article and the processes that led to its being added, and to reconsider its overall approach to fact checking.
Meta refuses to intervene
Meta directed The BMJ to its advice page, which said that publishers can appeal a rating directly to the relevant fact checking organisation within a week of being notified of it. “Fact checkers are responsible for reviewing content and applying ratings, and this process is independent from Meta,” it said. This means that, as in The BMJ’s case, if the fact checking organisation declines to change a rating after an appeal from a publisher, the publisher has little recourse.
The lack of an independent appeals process raises concerns, given that fact checking organisations have been accused of bias. “I worry about the amount of power placed in the hands of these third party groups,” says Jillian York, director for international freedom of expression at the Electronic Frontier Foundation, a non-profit organisation that promotes civil liberties in the digital world. “There’s no accountability structure. There’s no democratic process to this. And so, while I do see a role for fact checking and think it’s far superior to the alternative—which is Facebook just taking down content—I still worry about the effect that it can have on legitimate sources.”
In December Lead Stories published a response to The BMJ’s open letter to Mark Zuckerberg, implying that whistleblower Jackson was not a credible source.11
It said Jackson was not a “lab-coated scientist” and that her qualifications amounted to a “30-hour certification in auditing techniques.” Jackson has more than 15 years’ experience in clinical research coordination and management and previously held a position as director of operations. “I’ve never claimed to be a scientist,” she says. “The 30 hour course is not what qualifies me. All my years of having different roles in clinical trials is what qualifies me. Besides, someone new to clinical research would have noticed what was going on at Ventavia. It did not take an expert.”
Lead Stories also criticised The BMJ for failing to include Jackson’s “publicly expressed views of covid vaccines.” It pointed to tweets she had sent, all posted after The BMJ’s investigation was published. One criticised an episode of the children’s television show Sesame Street in which Big Bird gets a covid vaccine, and another expressed support for a US court ruling against making vaccination mandatory for federal employees. Lead Stories had highlighted the same tweets in its original fact check, saying that “on Twitter, Jackson does not express unreserved support for covid vaccines.”
“Since when is it the obligation of any citizen to show unreserved support for anything?” asked Schwitzer. “It’s absolutely immaterial to the topic at hand. For it to be in this independent review I think says more about the reviewer than the reviewee.”
Lead Stories is taking an editorial position on vaccination, York says, one that echoes Facebook’s own position. “The broader issue at hand is that companies like Facebook and some of the traditional media establishments are reasonably concerned about vaccine misinformation but have swung so far in the opposite direction as to potentially shut down legitimate questions about major corporations like Pfizer,” she said. The medical industry has a history of suppressing certain information, and citizens need to be able to question it, she added.
On 20 December Lead Stories also sent a series of inflammatory tweets after publishing its response to The BMJ’s open letter.11 It said, “Hey @bmj_latest, when your articles are literally being republished by a website run by someone in the ‘Disinformation Dozen’ perhaps you should be reviewing your editorial policies instead of writing open letters.”12
The tweet contained a picture of The BMJ’s article as republished by Children’s Health Defense, a website that questions the safety of vaccines and funds antivaccine adverts on Facebook. Lead Stories also raised questions about Paul Thacker, the author of The BMJ’s investigation, who was cited as such in the reposted article on the Children’s Health Defense website. Lead Stories tweeted, “Is @thackerpd really ok with being listed as an author on childrenshealthdefense.org? Or does he object to it? The answer will reveal a lot.”
Thacker did not write the piece for Children’s Health Defense. The website had republished The BMJ’s articles without complying with the journal’s licence terms, and The BMJ’s legal team has asked Children’s Health Defense to take them down.
Checking the checkers
Fact checking is not a completely unregulated business. IFCN was set up in 2015 to advocate “for higher standards among the global fact-checking community.”13 More than 100 fact checking agencies from around the world are signed up to IFCN’s code of principles and are verified by it. Signatories range from what the IFCN calls the “big beasts of traditional media,” such as Le Monde’s Les Décodeurs in France and the Washington Post in the US, to global newswires AFP, AP, and Reuters, and start-ups such as Rappler in the Philippines.
The code’s first principle is a commitment to non-partisanship. “Signatories do not advocate or take policy positions on the issues they fact check,” it says. The BMJ has submitted a complaint to the Poynter Institute, which runs the IFCN, alleging that Lead Stories’ conduct does not meet this commitment and is awaiting a response.
The BMJ plans to appeal to Facebook’s Oversight Board, an independent panel of 20 people from around the world that can decide whether Facebook should allow or remove specific content. The board reviews only a small number of “emblematic cases”: among them, it upheld the decision made on 7 January 2021 to ban the then US president, Donald Trump, from posting on Facebook and Instagram after the storming of the Capitol Building in Washington, DC, in which five people died. The board’s decisions are binding unless implementing them could violate the law.
Carolina Are, an online moderation researcher and visiting lecturer at City University in London, backs The BMJ’s efforts. “The BMJ is a reputable news organisation that has a huge platform and the means to challenge this stuff. But there are a variety of creators on social media and online in general who just get their profiles outright deleted when this stuff happens,” she says.
Meanwhile, readers are still facing problems sharing The BMJ’s investigation on Facebook.
Kamran Abbasi, The BMJ’s editor in chief, said, “We should all be very worried that Facebook, a multibillion dollar company, is effectively censoring fully fact checked journalism that is raising legitimate concerns about the conduct of clinical trials. Facebook’s actions won’t stop The BMJ doing what is right, but the real question is: why is Facebook acting in this way? What is driving its world view? Is it ideology? Is it commercial interests? Is it incompetence? Users should be worried that, despite presenting itself as a neutral social media platform, Facebook is trying to control how people think under the guise of ‘fact checking.’”
Footnotes
Competing interests: See bmj.com/about-bmj/editorial-staff.
Provenance and peer review: Commissioned; not externally reviewed.
This is an Open Access article distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited and the use is non-commercial. See: http://creativecommons.org/licenses/by-nc/4.0/.