The Supreme Court is about to decide the future of online speech
The ruling could impact other publishing industries and future social media regulation.
Social media companies have long made their own rules about the content they allow on their sites. But a pair of cases set to be argued before the Supreme Court on Monday will test the limits of that freedom, examining whether they can be legally required to host users’ speech.
The cases, Moody v. NetChoice and NetChoice v. Paxton, deal with the constitutionality of laws passed in Florida and Texas, respectively. Though the two laws differ in some details, both essentially limit the ability of large online platforms to curate or ban content on their sites, seeking to counter what lawmakers claim are moderation rules that suppress conservative speech. The fight has reached the Supreme Court in part because an appeals court declared Florida’s version of the law unconstitutional, while a separate appeals court allowed the Texas law to stand, creating a circuit split.
The laws’ opponents warn that a ruling for the states could force social media companies to carry “lawful but awful” speech like Nazi rhetoric or medical misinformation, which would likely repel a wide swath of users. Rather than risk that outcome, critics argue, platforms may choose to block whole categories of discussion — around topics like race — to avoid legal blowback.
It’s not just big social media platforms that are concerned about the effects of the laws. The nonprofit that runs Wikipedia and individual Reddit moderators have worried that they might need to fundamentally change how they operate or face new legal threats. More traditional publishers have warned that a ruling in the states’ favor could undercut their First Amendment rights as well.
But even some opponents of the laws fear that a broad ruling for NetChoice could hobble any future attempts to regulate a powerful industry.
“These cases are about the future of public discourse online,” says Scott Wilkens, senior counsel at the Knight First Amendment Institute at Columbia University, “and the extent to which that public discourse serves democracy.”
What to know about the cases
Texas’ HB 20 and Florida’s SB 7072 were both passed in 2021, months after former President Donald Trump’s ouster from social media platforms like Twitter following the insurrection at the US Capitol on January 6th. Tech industry groups NetChoice and the Computer & Communications Industry Association (CCIA) sued to block both laws, resulting in two very different rulings: the Eleventh Circuit Court of Appeals ruled in the groups’ favor on the Florida statute, while the Fifth Circuit Court of Appeals reached the opposite conclusion on the Texas law, leading the parties to petition the Supreme Court for a resolution. The Supreme Court agreed to consider two aspects of the social media laws: their so-called must-carry provisions and parts of their transparency requirements.
Must-carry provisions are requirements that platforms host speech even when they don’t want to. NetChoice has argued this unlawfully compels speech by the platforms, like forcing a newspaper to run an op-ed, while the states claim they are merely regulating the conduct of public forums within their purview. In addition to these requirements, the laws order platforms to explain why they remove or reduce the visibility of posts on their sites, a transparency standard the industry believes will be overly burdensome.
The arguments diverge somewhat beyond that. Florida’s statute includes quirks like special protection for political candidates and journalistic enterprises, while Texas’ simply grants broad protection based on “viewpoint.” The transparency standards also differ: Florida demands that social media companies provide a “thorough rationale” for why they choose to remove or “shadow ban” a post, while Texas’ law more simply requires platforms to provide a reason when they take down posts entirely.
Are platforms like newspapers?
A key focus of debate will likely revolve around the appropriate metaphor for tech platforms — including whether their moderation standards can be compared to other media like newspapers.
NetChoice is leaning heavily on a 1974 case called Miami Herald Publishing Co. v. Tornillo, where the Supreme Court held that a newspaper could not be forced to print a reply to its article. NetChoice argues that a social network choosing to ban certain content is similar to a newspaper exercising editorial judgment, and compelling either to host speech they abhor would violate the First Amendment. “There are some obvious differences between newspapers and online websites, but ultimately they are engaged in the same type of First Amendment-protected activity,” NetChoice litigation center director Chris Marchese says in an interview with The Verge.
Numerous legal experts have agreed with this claim to a point. Wilkens, for instance, believes the Supreme Court should strike down the rules requiring platforms host content they don’t want to. “The must-carry provisions are unconstitutional because they override the platforms’ exercise of editorial discretion and cannot survive even intermediate scrutiny,” the Knight Institute wrote in a brief signed by Wilkens. “These provisions force platforms to publish a vast array of speech they do not want to publish, and that they view as inconsistent with the expressive communities they are trying to foster.”
But Wilkens and others are wary of a ruling that grants everything NetChoice wants. The Supreme Court should “not construe the First Amendment rights of the platform so broadly that it would prevent governments from enacting carefully-drawn laws” about things like transparency and interoperability, he says. Better-written laws could still advance “First Amendment values,” he adds.
He also distinguishes between how each state plans to require transparency. In the Knight Institute amicus brief, he argues Florida’s “individualized-explanation provision” — which also lets individuals seek substantial damages — should be found unconstitutional while Texas’ should be upheld, because Texas’ disclosure requirements seem “far less onerous” and could likely be automated by the platforms.
And there are, in fact, obvious differences between huge sites like Facebook and a newspaper. “One factor the court may focus on is the fact that while newspapers closely curate all of the content that they publish, platforms do not have that kind of close curation of the enormous number of user posts that appear on the platform,” says Wilkens.
Gautam Hans, associate director of the First Amendment Clinic at Cornell Law School, says the states could also argue that the sheer volume of speech that platforms have to deal with precludes the “coherent editorial perspective” a newspaper might have.
“While I haven’t seen the law that touches the content moderation piece of it that I think is constitutional, I’m also not willing to foreclose that door forever,” Hans says. “Not because I think that state management of content is good, but I am sympathetic to the states’ points that this is a hugely important economic sector, and to largely insulate it from any kind of government regulation — particularly on the transparency side — I think would be imprudent just given the scale and scope of the problems.”
Is Facebook like a custom wedding website?
NetChoice will likely invoke other cases where the court ruled that various forums could not be compelled to carry speech. Last term, for instance, the Supreme Court decided 303 Creative v. Elenis, in which a Colorado website designer feared a state anti-discrimination law would compel her to make wedding websites for gay couples against her beliefs. The court determined that such an interpretation would violate the First Amendment, which NetChoice sees as a good sign for its own case.
“The conservatives on the court can’t simultaneously uphold 303 Creative, which they did last term, and not side with NetChoice,” says NetChoice vice president and general counsel Carl Szabo.
Another case that could come up is Hurley v. Irish-American Gay, Lesbian, and Bisexual Group of Boston, where the court held in the mid-1990s that organizers of a St. Patrick’s Day parade were not obligated to let the gay, lesbian, and bisexual group march in the event.
The states, meanwhile, will likely point to cases where it was constitutional to require private institutions to facilitate free expression. The 1980 case Pruneyard Shopping Center v. Robins held that a California shopping center could not bar students from soliciting petition signatures on its property. And in the 2006 case Rumsfeld v. Forum for Academic and Institutional Rights, the court let Congress tie federal education funding to colleges allowing military recruiters to reach students on campus, even if those schools opposed the military’s sexual orientation policies.
On the question of the transparency requirements, expect to hear about Zauderer v. Office of Disciplinary Counsel, Supreme Court of Ohio, which held that the government can compel certain factual commercial disclosures to consumers. But the standard it laid out is supposed to apply to uncontroversial disclosures, so its application here may be slippery.
The standard “doesn’t really get you very far because controversy’s obviously a) in the eye of the beholder, but b) very easy to manufacture,” Hans says. “It doesn’t give you a lot of power one way or the other to say, ‘Oh, this is clearly within Zauderer or this clearly isn’t.’” Hans says the justices might consider whether the standard should be updated or abandoned as part of their analysis.
The Supreme Court right now is full of mixed signals
In the last major tech case the Supreme Court decided, Gonzalez v. Google, the justices declined to address major questions around Section 230, a foundational internet law. The relatively light-touch approach came after arguments where the justices acknowledged their lack of expertise in the tech field — “these are not like the nine greatest experts on the internet,” Justice Elena Kagan quipped at the time.
The justices will be on more familiar ground dealing with the First Amendment, experts interviewed for this article say. Still, exactly how they interpret the case could be a surprise. Even in the decision to grant an emergency order blocking Texas’ law, the liberal Kagan dissented alongside conservatives Samuel Alito, Clarence Thomas, and Neil Gorsuch.
Thomas in particular has written about social media companies in a way that could seem ominous for tech platforms. In 2021, he mused about whether they should be considered common carriers that can be more heavily regulated. But NetChoice’s Marchese, who praised Thomas and had him as a professor in law school, says he isn’t worried. “I think Justice Thomas was very honest in his concurring and dissenting opinions where he says, ‘Look, I’m curious about these alternative arguments — the states say that they’re common carriers, give me some evidence of that,’” Marchese says. “So I think his statements shouldn’t be taken as a conclusive matter of law, but more as curiosity.”
The stakes are higher than just social media
While most of the discussion around these cases has focused on big tech platforms like commercial social networks, a decision against them could be applied to everyone from traditional media outlets to individual website moderators, too.
Several publishing industry groups, including the Reporters Committee for Freedom of the Press, American Booksellers for Free Expression, and the Motion Picture Association, signed a brief opposing the state laws. “Upholding Texas and Florida’s intrusion on editorial autonomy would undermine the rights of publishers of all kinds,” the brief argues. A ruling for the states would come at a time when lawmakers in Florida and elsewhere have pushed rules suppressing books and other speech about topics like race or gender identity, some of which have so far been blocked by courts.
The Wikimedia Foundation, which operates Wikipedia, has also opposed the law. In its brief, it speculated that its massive online encyclopedia could be whittled down to the least controversial topics to avoid excessive legal burden. “Rather than be forced to disseminate obviously false information or to provide a thorough rationale each time a Wikipedia article is edited, the Foundation and its users may decide that the safer course is to avoid certain topics altogether — thus resulting in an ‘encyclopedia’ that omits mention of critical social and political issues of the day,” the group wrote.
The moderators of two subreddits even warned that Texas’ law might be interpreted to allow the attorney general to file suit against them as individuals — not just Reddit — if they made a moderation call a user didn’t like.
And beyond the Texas and Florida laws’ immediate consequences, the Supreme Court’s decision will help determine states’ power to regulate online platforms at all. The Open Markets Institute, a nonprofit that advocates for robust competition policies, took “no position on the wisdom” of the state laws or the First Amendment analysis. But it wrote in a brief that the court should honor states’ ability to regulate platforms “as common carriers if and when they determine it is appropriate.”
“What the Supreme Court says in these cases could have an enormous impact on state and federal legislation going forward that tries to regulate social media platforms’ content moderation,” Wilkens says, pointing to efforts to regulate kids’ online safety across states and in Congress. “Those kinds of statutes could very much be influenced by what the Supreme Court says in these NetChoice cases. The Texas and Florida statutes are the first statutes in the nation that attempt to regulate social media platforms’ content moderation.”
“No matter what happens,” Hans says, “this is not going to be the end of the conversation in the courts and the states.”