White House argues platforms should be liable if algorithms promote harmful content
There are some cases in which companies such as Google should be liable for content on their platforms, the Biden administration argued in a brief submitted Wednesday in a Supreme Court case that could rewrite the rules around how social media giants battle disinformation online.
The administration’s brief, submitted on behalf of the plaintiffs in the Gonzalez v. Google case, argues that while Section 230 of the 1996 Communications Decency Act provides broad legal immunity that protects Google, and its subsidiary YouTube, from liability for most content uploaded by users, that shield does not extend to how its algorithms promote harmful content. The Supreme Court is expected to hear the case this term.
“The effect of YouTube’s algorithms is still to communicate a message from YouTube that is distinct from the messages conveyed by the videos themselves,” the filing said. “Even if YouTube plays no role in the videos’ creation or development, it remains potentially liable for its own conduct and its own communications.”
The legal brief follows longstanding calls by President Biden to pare down legal protections large platforms such as Facebook and Google have long relied on to shield themselves from lawsuits seeking damages for their users’ false, hateful and violent speech.
The outcome of the Gonzalez case could have major implications for how social media platforms and search engines present content to users.
Online platforms rely on recommendation algorithms like those at the center of the Gonzalez case to decide what information to present to users from the vast stores available online. If the justices rule that Google was responsible for recommending harmful, terrorist content, other platforms may be exposed to lawsuits for their role in spreading such material.
Asked for comment on the administration’s brief, a Google spokesperson said that YouTube has invested for years in technology, teams and policies to identify and remove extremist content.
“We regularly work with law enforcement, other platforms, and civil society to share intelligence and best practices,” spokesperson José Castañeda said via email. “Undercutting Section 230 would make it harder, not easier, to combat harmful content — making the internet less safe and less helpful for all of us.”
The family of a woman named Nohemi Gonzalez, who died in the 2015 Islamic State attacks in Paris, brought the case against Google. They sued Google on the grounds that its YouTube service recommended ISIS propaganda, spurring a terrorist attack that killed Gonzalez and 129 others.
The Supreme Court announced it would hear Gonzalez v. Google after a divided federal appeals court in California ruled that Section 230 protects social media companies’ recommendations. The majority opinion noted that the law “shelters more activity than Congress envisioned it would.”
The Gonzalez family had petitioned the Supreme Court to hear the case. According to the family’s complaint, YouTube provides “a unique and powerful tool of communication that enables ISIS to achieve [its] goals.” Two of the ISIS jihadists who carried out the Paris attacks used social media platforms to post links to ISIS recruitment videos hosted on YouTube, according to the family’s suit.
Because YouTube relies on algorithms that feed users content based on their viewing history, the Gonzalez family has asserted that YouTube helped facilitate jihadist networking by recommending ISIS videos.