Written by: Sheera Frenkel
At the start of the pandemic, a group of Facebook data scientists held a meeting with executives to request resources to help measure the prevalence of misinformation about COVID-19 on the social network.
Data scientists said figuring out how many Facebook users saw false or misleading information would be complex, possibly taking a year or more, according to two people who attended the meeting. But they added that by assigning new hires to the project and reassigning some existing employees to it, the company could better understand how false information about the virus spread on the platform.
Executives never approved the resources, and the team was never told why, according to the people, who requested anonymity because they were not authorized to speak to reporters.
Now, more than a year later, Facebook has been caught in a storm over the very kind of information the data scientists had hoped to track.
The White House and other federal agencies have pressured the company to provide data on how anti-vaccine stories have spread online and have accused Facebook of withholding key information. President Joe Biden on Friday accused the company of “killing people” by allowing false information to circulate widely. On Monday, he backed down slightly, instead blaming the people behind the lies.
“Anyone listening to it is getting hurt by it,” Biden said. He said he hoped that instead of “taking it personally,” Facebook “would do something about the misinformation.”
The company responded with statistics on how many posts containing misinformation it had removed, as well as how many Americans it had directed to factual information about the government’s response to the pandemic. In a blog post published Saturday, Facebook called on the Biden administration to stop “pointing fingers” and blaming Facebook after the administration missed its goal of vaccinating 70% of American adults by July 4.
“Facebook is not the reason this goal was missed,” Guy Rosen, Facebook’s vice president of integrity, said in the post.
But the sharp back-and-forth exposed an uncomfortable truth for the company: it does not, in fact, know many details about how misinformation about the coronavirus, and the vaccines to fight it, has spread. That blind spot has reinforced the concerns of disinformation researchers about Facebook’s selective release of data and about how aggressively, or not, the company has investigated misinformation on its platform.
“The suggestion that we have not dedicated resources to tackling misinformation about COVID and supporting vaccine deployment is simply not supported by the facts,” said Dani Lever, a spokesperson for Facebook. “Without a standard definition of vaccine misinformation, and with both false and even true content (often shared by mainstream media) potentially discouraging vaccine acceptance, we focus on outcomes – measuring whether people who use Facebook accept COVID-19 vaccines.”
Facebook executives, including its CEO, Mark Zuckerberg, said the company committed to removing misinformation about COVID-19 when the pandemic began. The company said it has removed more than 18 million pieces of COVID-19 misinformation since the start of the pandemic.
Experts who study disinformation said the number of items Facebook removed was not as informative as knowing how many pieces of misinformation had been uploaded to the site, or in which groups and pages people were seeing misinformation spread.
“They have to open the black box which is their architecture for ranking and amplifying content. Take this black box and open it for audit by independent researchers and the government,” said Imran Ahmed, CEO of the Center for Countering Digital Hate, a nonprofit that aims to fight disinformation. “We don’t know how many Americans have been infected with disinformation.”
Social media companies: “Look at yourself in the mirror: think about this misinformation being passed on to your son, your daughter, your parent, someone you love: that’s all I’m asking.”
– Center for Countering Digital Hate (@CCDHate) July 19, 2021
Ahmed’s group, using publicly available data from CrowdTangle, a Facebook-owned program, found that 12 people were responsible for 65% of the COVID-19 misinformation on Facebook. The White House, including Biden, repeated that figure last week. Facebook says it disagrees with the characterization of the “disinformation dozen,” adding that some of their pages and accounts have been removed, while others no longer post content that violates Facebook’s rules.
Renée DiResta, a disinformation researcher at the Stanford Internet Observatory, called on Facebook to release more granular data that would allow experts to understand how false claims about vaccines were affecting specific communities within the country. That information, known as “prevalence data,” essentially measures the reach of a piece of content, such as the percentage of people in a community on the service who see it.
“The reason more granular prevalence data is needed is that false claims do not spread equally among all audiences,” said DiResta. “In order to effectively counter the specific false claims that communities see, civil society organizations and researchers need a better idea of what is going on within these groups.”
Many Facebook employees have made the same point. Brian Boland, Facebook’s former vice president in charge of partnership strategy, told CNN on Sunday that while he was at the company, he argued that it should share as much information as possible. Asked about the dispute with the White House over disinformation about COVID-19, he said: “Facebook has this data.”
“They are looking at it,” Boland said. But he added, “Are they looking at it the right way? Are they investing in teams as fully as they should?”
Boland’s comments have been widely repeated as proof that Facebook has the requested data but is not sharing it. He did not respond to a New York Times request for comment, but one of the data scientists who had pushed inside Facebook for further study of coronavirus misinformation said the issue was more about whether and how the company studied the data.
Technically, the person said, the company has data on all the content that passes through its platforms. But measuring and tracking misinformation about COVID-19 requires first defining and labeling what qualifies as misinformation, which the person said the company has not committed resources to.