The 2015 murder of a young woman in Paris by ISIS terrorists is at the center of a U.S. Supreme Court case that challenges the liability protection social media companies enjoy for hosting content published by others.
The case, Gonzalez v. Google, revolves around slain college-exchange student Nohemi Gonzalez, 23, and whether Google, which owns YouTube, helped ISIS recruit by recommending certain content through YouTube’s algorithms. American law forbids aiding and abetting terrorists, but Google claims Section 230 of the Communications Decency Act protects it from legal responsibility for videos pushed by its recommendation algorithms.
Supreme Court justices on Tuesday struggled to grasp the issues raised in the case.
“We’re a court. We really don’t know about these things,” Justice Elena Kagan said. “You know, these are not like the nine greatest experts on the internet.”
Justice Brett Kavanaugh suggested, “Isn’t it better … to put the burden on Congress to change that, and they can consider the implications and make these predictive judgments?”
Justice Neil Gorsuch questioned Google’s claim that algorithms are neutral, and suggested the case should be sent back to the U.S. Court of Appeals for the 9th Circuit, which had ruled for Google.
Law professor Eric Schnapper, who represents the Gonzalez family, argued that Section 230 of the Communications Decency Act makes a distinction between claims holding internet companies liable for content created by others and claims that hold internet companies responsible for their own actions.
“Helping users find the proverbial needle in the haystack is an existential necessity on the internet,” countered Lisa S. Blatt, representing Google. “Search engines thus tailor what users see based on what’s known about users. So does Amazon, Tripadvisor, Wikipedia, Yelp!, Zillow, and countless video, music, news, job-finding, social media, and dating websites.”
Justice Amy Coney Barrett asked Blatt whether platforms such as YouTube would remain protected if their sorting mechanism were not neutral but “really defamatory or pro-ISIS.” Blatt argued that Section 230 would still protect the platforms.
Schnapper countered that features Blatt described as “inherent” in publishing, such as topic headings and “trending now” tags, were in fact not inherent.
YouTube describes itself as a public forum, and under Section 230 it is not liable for content that users place on its site. The congressional findings accompanying Section 230 describe the internet as offering “a forum for a true diversity of political discourse,” though the law does not impose that as a requirement on platforms.
In mid-October 2020, FCC Chairman Ajit Pai released a statement announcing his intention to clarify the meaning of Section 230.
“The U.S. Department of Commerce has petitioned the Commission to ‘clarify ambiguities in section 230.’ … As elected officials consider whether to change the law, the question remains: What does Section 230 currently mean?” Pai wrote. “Many advance an overly broad interpretation that in some cases shields social media companies from consumer protection laws in a way that has no basis in the text of Section 230. … Social media companies have a First Amendment right to free speech. But they do not have a First Amendment right to a special immunity denied to other media outlets, such as newspapers and broadcasters.”