In what could be a seminal case of the Internet age, the U.S. Supreme Court this week heard arguments in Gonzalez v. Google, its first case concerning the hotly debated Section 230 of the Communications Decency Act. The case’s potential ramifications might be gleaned from the 70-plus amicus briefs filed by major companies, states, elected officials, and organizations.
Section 230 provides Internet platforms with immunity from liability arising out of third-party content posted to their websites. The statute prevents a “provider or user of an interactive computer service” from being treated as “the publisher or speaker of any information provided by another information content provider.” In this case, the Gonzalez family sued YouTube for making targeted recommendations of recruitment videos created by the terrorist organization ISIS. The family’s daughter died in an ISIS terrorist attack, and they claim that Section 230 should not shield YouTube from civil liability when its algorithms recommend harmful content such as these videos.
YouTube’s parent company, Google, countered that almost every major Internet platform uses algorithms to provide content to users, for a variety of reasons, and that limiting Section 230 to exclude protection where algorithms are used would stifle the Internet as it’s known today.
Although oral argument at the Supreme Court cannot predict the outcome of a case, it provides insight into what issues concern the justices. Their questions showed them grappling with whether an Internet platform’s use of algorithms itself amounts to creating content. They spent much of the argument trying to distinguish the difference in outcomes, if any existed, between a platform merely organizing and prioritizing content and a platform actively recommending and pushing content to users.
Justice Clarence Thomas asked whether, and how, a “neutral” algorithm recommending content could be seen as aiding and abetting unlawful conduct. The same algorithms that push extremist terrorist content could also push rice pilaf recipes or cat videos, to name some of the examples posed at argument.
Justice Sonia Sotomayor queried whether the platforms could be held liable for writing algorithms that inadvertently discriminate against a class of individuals. Google’s counsel responded that such an interpretation would permit a deluge of negligence and product liability claims, with any user able to bring a lawsuit if they were unhappy with how platforms presented content to them.
A ruling that curtails Section 230’s immunity could change the kind of content that users see. Google warned that if the Court adopted the family’s reading of Section 230, platforms might be forced to aggressively censor third-party content, including advertisements, ensuring that only the most benign content appears on their sites.
The justices also openly worried about the practical consequences of opening platforms to suit based on third-party content. Chief Justice John G. Roberts noted that enormous amounts of content are fed to users based on individualized metrics. Justice Brett M. Kavanaugh also noted such a change could lead to a flood of lawsuits and nodded to the concern that such a change could “crash the digital economy.”
Justice Samuel A. Alito and Justice Ketanji Brown Jackson appeared amenable to reviewing how the lower courts have interpreted Section 230. Alito suggested that any content posted online that is not displayed randomly by platforms could be considered “publishing” and therefore outside of Section 230’s protection. Jackson questioned Google’s counsel about whether Congress had today’s algorithms in mind at all when it crafted those protections.
The Solicitor General’s Office appeared to take a middle-ground view, seeking to preserve the distinction between a platform’s decision about how to distribute third-party content and the content itself. The government did not support an exception to immunity merely because a platform makes a recommendation the user did not request, but did support one if a platform somehow incorporated its own comment on, or endorsement of, that third-party content.
In response to Kavanaugh’s concern that such an exemption would capture almost all algorithmic actions, the government noted that if a state legislature enacted legislation that prohibited Internet platforms from prioritizing companies that advertised with them, there was an open question of whether that violated the Commerce Clause or First Amendment. The government noted that it preferred courts to look at the elements of the cause of action rather than whether immunity was proper under Section 230.
In a lighter moment during arguments, Justice Elena Kagan dryly noted that the Court was not composed of the “nine greatest experts on the Internet” and asked why they should not leave it to Congress to amend Section 230.
The Court is expected to release a decision this summer. How it decides the Section 230 question may shape how content is shared on the Internet. Once the Court issues its decision, and particularly if it also agrees to review the constitutionality of Texas’s and Florida’s censorship laws, the Internet landscape faces potentially significant changes in the near future.