Supreme Court to hear case on how the government talks to social media companies


On Monday, the Supreme Court will hear a case that could upend how social media platforms deal with posts containing anything from vaccine misinformation to election threats. 

At the moment, various arms of the US government communicate directly with platforms for all sorts of reasons. For example, the Centers for Disease Control and Prevention (CDC) might email directly with someone at Facebook during a global pandemic, especially if Facebook wants to set up an information hub for its users. (You can imagine similar scenarios for voter misinformation, election integrity, and all kinds of public emergencies.)

The core question at issue in Murthy v. Missouri is whether the government can flag potentially harmful posts to social media companies without it turning into unconstitutional coercion of speech. (Coercion in this vein is generally called “jawboning.”) 

These arguments come just weeks after the court heard another set of First Amendment challenges involving social media. In those cases, Moody v. NetChoice and NetChoice v. Paxton, the court considered whether state laws legislating how social media companies could moderate posts on their sites violated the platforms’ own First Amendment rights.

How Murthy v. Missouri reached the Supreme Court

The case at issue got started when Republican state attorneys general from Missouri and Louisiana decided to sue the Biden administration in May 2022, arguing that various government arms — including the CDC and the Cybersecurity and Infrastructure Security Agency (CISA) — violated the First Amendment while communicating with social media companies. The AGs claimed that the Biden administration effectively coerced the platforms to take down posts or accounts spreading what was identified as covid or voting misinformation.

If this legal argument sounds unsettlingly familiar, it may be because much of Murthy is echoed in Elon Musk’s extremely cursed crusade to make the “Twitter Files” a thing. The original Missouri state AG’s press release about the lawsuit references Hunter Biden’s laptop, the Wuhan lab leak theory, and the efficacy of masking.

In July 2023, a federal district court enjoined parts of the Biden administration from communicating with social media platforms, as well as groups like the Stanford Internet Observatory and Election Integrity Partnership, which track the spread of misinformation. Later that year, the Fifth Circuit Court of Appeals upheld much of that broad preliminary injunction, though it narrowed some of its scope. Soon after, the Supreme Court lifted limits on the administration’s communications while considering the case.

The decision in Murthy v. Missouri will help determine the extent to which the Biden administration can notify social media platforms about potentially concerning content on their sites. That could affect the safeguards these companies have put up around misinformation, and may change the kind of flags and warnings you see on posts all over the internet. SCOTUS is likely to issue a decision around June, just months ahead of the November elections.

Coercion versus persuasion

There are two Supreme Court precedents likely to come up during the arguments Monday: Bantam Books v. Sullivan and Blum v. Yaretsky.

Bantam Books is a case from 1963 involving a Rhode Island commission created to evaluate whether books were appropriate for minors. The court ruled that the commission effectively coerced book distributors to suppress certain works through intimidation.

The court “found that the intent of the commission was not to educate or inform the book distributors about how to comply with the law, but rather to intimidate them into suppressing and censoring content that the commission didn’t like,” according to Jennifer Jones, staff attorney at the Knight First Amendment Institute at Columbia University. The court recognized that while it’s constitutional for the authorities to advise or persuade intermediaries to act in a certain way, the government had gone too far in this instance. “When the acts become coercive, and when they basically apply this unrelenting pressure so that the intermediaries don’t publish speech because the government doesn’t like it, that does, in fact, violate the Constitution.”

Jones added that Murthy v. Missouri will be “the first time that the court is going to evaluate the application of that framework in the context of social media.” If the court agrees with the state AGs that the Biden administration inappropriately communicated with social media companies, Jones said, “that could really severely limit the ability of government officials to communicate with and work with the platforms moving forward.”

But it’s not just the government’s ability to reach out to platforms that’s at risk — communication is a two-way street, after all. Jones said that social media platforms themselves could be “chilled” from reaching out to the government to verify information (as they might be inclined to do during a public health emergency like the covid-19 pandemic, for example) for fear of being held liable if they’re seen as doing the government’s bidding.

That’s because Blum v. Yaretsky created “a standard for when the conduct of a private actor can actually be transformed into state action,” Jones said. “And therefore, that private actor can be held liable because they’re essentially acting at the behest of the government.”

The US Chamber of Commerce, a broad business group, warned the court in its brief against allowing the theory of state action to be used to punish private entities. “[E]ven if the Court views this case through the lens of state action, it should confirm that when the government interferes with private speech choices, the remedy lies in restraining the government — not in further abridging the rights of the coerced private parties with injunctions that limit the exercise of their own First Amendment rights,” the group wrote. 

Gautam Hans, associate director of the First Amendment Clinic at Cornell Law School, anticipates that Murthy v. Missouri will be a difficult case for the court to parse, since the states take issue with a vast range of actions by different parts of the government.

Unlike Bantam Books, Hans said, where “it’s pretty clear what happened, and it was also one specific instance of government interference, here we have a whole plethora of actions that are not necessarily created equal.” He added that the Supreme Court might not be the right place to sort out some of these tricky and fact-specific questions.

Wide-ranging impact

Several interest groups wrote amicus briefs to the court warning that no matter how the justices rule, they should be careful not to craft a standard that harms these groups’ work.

For example, a bipartisan group of current and former election officials submitted a brief emphasizing the importance of allowing people in their roles to “remain free to communicate with social media platforms to share accurate information about when, how, and where to vote; to correct false election information; and to address violent threats and intimidation directed at their own ranks.” They also said it’s important that election officials and government agencies are able to respond when social media companies reach out for help in promoting accurate voting information and limiting the spread of false content. 

Members of the nonpartisan Election Protection coalition said they fear a ruling for the states would “endanger the right to vote as information sharing between and among civil society, government, and social media companies is essential to prevent malicious election interference and voter suppression efforts.”

The Reporters Committee for Freedom of the Press warned in a brief against creating a standard of coercion that is overly broad. 

“A too-sensitive test for coercion could have two negative consequences,” the committee wrote. “First, it could lead to the chilling of the free flow of information from government sources to the news media. Second, it could license plaintiffs to pursue burdensome fishing expeditions for what they believe to be evidence of collusion between journalists and public officials.”

Medical groups including the American Medical Association (AMA) asserted in a brief that the Biden administration has a “‘compelling interest’ in combatting vaccine misinformation.” That’s because, according to the AMA, “it is an indisputable scientific fact that vaccinations save lives.”

But the smaller and conservative-oriented Association of American Physicians and Surgeons (AAPS) wrote in a brief that accepting the American Medical Association’s assertion would “green-light government censorship” of people like Robert F. Kennedy, Jr., a presidential candidate and leading voice in the anti-vaccine movement.

“The same arguments made by the AMA Amici could be extended to other types of speech disfavored by the Biden Administration, such as criticism of transgender procedures and late-term abortion,” the AAPS wrote.

NetChoice and the Computer and Communications Industry Association (CCIA), the leading parties on the other social media First Amendment cases this term, joined with other industry groups to make a point about how their cases intersected with this one. While they didn’t take a position on the case itself, the groups wrote in a brief that there needs to be a “clear rule” to prevent governments from compelling platforms’ speech or preventing content moderation “by informal or indirect cajoling or coercion.” They also asked the court to clarify “that those digital services themselves are not state actors and may not be held liable for the government’s actions.”

Hans noted that many of the amicus briefs were filed in support of neither party, even from First Amendment groups that would typically take a stand in such a case. “I take that as a sign of the sort of theoretical messiness of the issues in this case,” he said.