Evangelical, Jewish groups use differing legal tactics in social-media case
Supreme Court considering Florida, Texas laws backed by conservatives
Supreme Court briefs filed in a pair of cases that are likely to result in a landmark decision offer a glimpse at differing perspectives that advocates for two religious groups hold about social media.
The cases are Moody v. NetChoice and NetChoice v. Paxton, which arose from laws passed in 2021 by legislatures in Florida and Texas, respectively, that were concerned about “censorship” and other alleged mistreatment of conservative voices in social media.
To put it simply, the amicus curiae or friend-of-the-court briefs suggest that evangelical Christians tend to be distrustful of the big social-media companies, while Jewish interests tend to see responsible social media as their potential allies. And even though it is governments that approved the legislation, the evangelical-backed brief also sees governments as the potential enemy.
The Florida law, known as the Stop Social Media Censorship Act or Senate Bill 7072, arose from concerns over some social media, such as Twitter (now X), shutting down Donald Trump’s accounts as well as the alleged suppression of “news” stories that included unverified conspiracy theories and other problematic material. The law imposes hefty fines (up to $250,000 per day) on social-media platforms that deactivate or close accounts of political candidates or act against “journalistic enterprises.”
The Texas law, House Bill 20, takes a broader approach. It would prevent large social-media companies (those with at least 50 million active U.S. users) from blocking or demonetizing content based on viewpoint. It would also require social media to disclose their procedures for curating content as well as their algorithms for displaying content.
The high court agreed in September to review the cases. Oral arguments have yet to be scheduled, although parties in both cases have until mid-January to file their written arguments.
The organizations with religious interests that filed the amicus briefs are The Becket Fund for Religious Liberty, which often litigates on behalf of conservative Christians but also has supported numerous minority religions such as Native Americans and Sikhs; the American Center for Law and Justice (ACLJ), known for supporting numerous religious-right and Christian nationalist causes; the American Jewish Committee; and the Anti-Defamation League, a well-known pro-Jewish organization.
Becket’s brief took no stand on the central issue in the case, whether the two states’ laws infringe on the First Amendment. It merely asked the court to take into account that there are constitutional differences between the legal treatment of speech in general and religious speech, the latter of which has protections that other speech does not.
The other three briefs, however, took clashing approaches to the states’ legislation: The ACLJ was sympathetic to the purposes of the two laws but also warned of what might happen if NetChoice were to win. The two Jewish groups, by contrast, pointed out how social media can be an important tool in combating antisemitism and warned that upholding the laws could hinder the companies from performing that role.
Here’s a more detailed look at the two sets of arguments:
ACLJ wary of both social media and government regulation
Although the ACLJ has been ideologically allied with the forces that advocated for the laws in Florida and Texas, its amicus brief does not call on the high court to decide either for or against the social media nor the state laws. Instead, the brief cautions about the dangers of what it calls the “tech titans” as well as of the possibility of other types of government control.
“This case presents enormous risks on both sides,” the brief warns. While the ACLJ recognizes the companies’ argument that under the First Amendment the government does not have the right to interfere with their editorial discretion, a victory for them could give them the ability to “go wild” in promoting their views:
In other words, the censorship of unwelcome viewpoints from social media platforms would likely become much more aggressive and much more frequent. That censorship would aim, under current conditions, to suppress morally traditional, culturally or politically conservative, historically Christian, pro-life, or other viewpoints that disrupt or depart from the regnant narrative. (And, of course, should a cultural and political shift in the echelons of power come about, the censorship could operate in the opposite direction.)
But that is not the only danger, the ACLJ warns. If NetChoice wins its argument, then it is possible that major online retailers such as eBay and Amazon might discriminate against sales of items such as pro-life T-shirts and pro-family swag.
But if the government wins its argument, the ACLJ writes, it might also act in ways that cause the social-media companies to discriminate against messages the government doesn’t want people to read.
The ACLJ’s brief also asks the court, if it approves the regulation of social media, to recognize the right of mission-oriented media to be exempt from those regulations. For example, an entity should be allowed to operate a platform that is billed as pro-environment or pro-Christian without being expected to balance the content in another direction.
Jewish groups applaud social media’s effort to combat antisemitism
Although not all social media take active efforts to prevent or remove content that promotes hate, those that do so should be allowed to continue doing so, both the American Jewish Committee and the Anti-Defamation League argue in their briefs. The two groups present similar arguments, pointing out the harm that comes from hate speech.
Said the Committee’s brief:
Recognizing the relationship between online hate and offline violence, many online services have chosen to moderate third-party content to mitigate this danger. Different services have taken different approaches to this problem and with varying degrees of success. But, amicus submits, it is essential that they retain the freedom to do so — and to respond quickly, decisively, and efficiently, before violent online rhetoric materializes in the real world. The state laws under review would impede that freedom, with potentially grave real-world consequences.
That brief gave three examples of the “vicious cycle” of online hate leading to hateful action:
The October 2018 Tree of Life Synagogue mass shooting during Saturday morning services, the deadliest attack on the Jewish community in U.S. history. Before the murders, the shooter had posted numerous hateful and antisemitic messages on the social medium Gab, many of them repostings from other social media.
The March 2019 massacre at mosques in Christchurch, New Zealand. The shooter live-streamed the killings on Facebook, and the video was posted to other social media.
The May 2022 shooting of Black patrons at a supermarket in Buffalo, N.Y. The shooter live-streamed his attack on Twitch, and evidence suggested that he was motivated at least in part by Internet-based media. (Twitch shut down the video shortly after it started, but by then it was already circulating on other media.)
The Committee called content moderation an “important tool for minimizing the spread of hateful messages about faith and ethnic groups.”
“Responsible social media services must retain the freedom to quickly and efficiently respond when their services are used as tools to propagate hate and violence in the real world,” the Committee also said, and it argued that the Florida and Texas laws could impede those efforts.
The League’s brief similarly cautioned about the risks of preventing social-media companies from moderating their content:
There is a well established and growing body of research linking online hate to offline violence, both at the individual and group levels. Disabling social media platforms from combating and containing online hate and harassment is certain to increase the amount of offline violence perpetrated against members of minority and marginalized communities. Even hate that stays online can cause harm: every day, Americans face exclusion from online spaces merely for existing as ethnic, religious, or other minorities, deeply chilling their participation in public discourse and depriving them of full citizenship.
NetChoice is an organization made up of about 35 large social-media and online retail and service companies. Among its members are Meta, TikTok, Amazon, Google, Airbnb, AOL, Nextdoor, Travelocity and eBay.