The surgeon general wants Facebook to do more to stop Covid-19 lies

Dr. Vivek Murthy, United States Surgeon General, is taking a stand against health misinformation. | Samuel Corum/CNP/Bloomberg via Getty Images

US Surgeon General Vivek Murthy says that misinformation — much of it on tech platforms — is a public health threat that has cost lives and prolonged the Covid-19 pandemic.

As Murthy said in a Thursday press conference, health advisories are usually about things people physically consume: food, drinks, cigarettes. But the first advisory of his tenure in the Biden administration (he was also the surgeon general under President Obama) is about what we consume with our eyes and ears: misinformation.

The advisory comes with a set of guidelines on how to “build a healthy information environment,” with recommendations for everyone from social media users to the platforms themselves, as well as for health workers, researchers, and the media. Murthy also went on some of those very platforms, including Twitter and Facebook, to spread the message.

“Today, we live in a world where misinformation poses an imminent and insidious threat to our nation’s health,” Murthy said at the press conference, adding that “modern technology companies” have allowed misinformation and disinformation to spread across their platforms “with little accountability.”

The advisory isn’t a set of orders these companies must follow, but the increased scrutiny and attention does put pressure on them to combat the falsehoods spreading on their platforms more aggressively.

Sen. Josh Hawley (R-MO), a frequent Big Tech critic, has already pushed back against the advisory, accusing Facebook and Twitter of colluding with the Biden administration to censor speech. Press secretary Jen Psaki told reporters that the White House has been in contact with those platforms and flags problematic content to them, which Hawley interpreted to mean that the platforms have “functionally become arms of the federal government.”

The health advisory comes as Covid-19 vaccination rates in the United States are dropping, cases are picking back up, and the fast-spreading delta variant is taking hold. The vast majority of Covid-related hospitalizations and deaths have been among people who aren’t vaccinated, despite the widespread availability of vaccines in the US. And with some people choosing not to get vaccinated because they believe misinformation about the vaccines, the Biden administration has reportedly decided it’s time to fight back.

Coronavirus misinformation doesn’t appear only on social media. But social media gives it a stage and a reach that offline channels don’t have, and that has been a concern for years. Mis- or disinformation potentially influenced the outcome of the 2016 presidential election, increased political polarization, contributed to the rise of the QAnon conspiracy theory, played a role in the ethnic cleansing of the Rohingya Muslims in Myanmar, and, now, has helped prolong the pandemic.

As researcher Carl T. Bergstrom, co-author of “Stewardship of global collective behavior,” a paper that calls for more research into social media’s impact on society, told Recode’s Shirin Ghaffary, “social media in particular — as well as a broader range of internet technologies, including algorithmically driven search and click-based advertising — have changed the way that people get information and form opinions about the world. And they seem to have done so in a manner that makes people particularly vulnerable to the spread of misinformation and disinformation.”

For their part, social media platforms have tried to stop the spread of false information: removing posts and videos, banning accounts that spread it, and appending fact-checks or links to trusted information on posts and videos that might be misleading. In late 2020, as a Covid vaccine looked increasingly imminent, various platforms prepared proactively for the vaccine misinformation that would (and did) inevitably follow. This came after years of these companies doing very little to stop the spread of misinformation about other vaccines, despite many warnings from experts about the public health harm of hosting anti-vaccine content and communities.

“We agree with the Surgeon General — tackling health misinformation takes a whole-of-society approach,” a Twitter spokesperson told Recode in a statement. “We’ll continue to take enforcement action on content that violates our COVID-19 misleading information policy and improve and expand our efforts to elevate credible, reliable health information — now, amid the COVID-19 pandemic — and as we collectively navigate the public health challenges to come.”

YouTube spokesperson Elena Hernandez told Recode that the platform “removes content in accordance with our COVID-19 misinformation policies, which we keep current based on guidance from local health authorities. We also demote borderline videos and prominently surface authoritative content for COVID-19-related search results, recommendations, and context panels.”

And Kevin McAlister, of Facebook, told Recode that the company has “partnered with government experts, health authorities, and researchers to take aggressive action against misinformation about COVID-19 and vaccines to protect public health,” removing millions of pieces of Covid-19 misinformation while trying to guide users to trusted sources about the virus and vaccines.

But many believe the platforms’ efforts are still too little, too late — including, it seems, the surgeon general.

“We expect more from our technology companies,” Murthy said.

Let’s see if we get it — and if, at this point, it will help.


Update, July 16, 11:45 am ET: Added Sen. Hawley’s statement and a comment from Facebook.