Yesterday, Authors Alliance joined a diverse group of creators of online content on an amicus brief in Gonzalez v. Google, a case before the Supreme Court. The case is about Section 230 of the Communications Decency Act and whether it protects curated recommendations by platforms. Section 230 protects online service providers from legal liability for content generated by users, and is considered by many to be essential for a vibrant and diverse internet. By shielding platforms from liability for speech their users make on these platforms, Section 230 enables the free flow of ideas and expression online, including speech on controversial topics. This is consistent with First Amendment values and the functioning of the internet as we know it.
The case concerns ISIS recruitment videos posted on YouTube, which the petitioner alleges were recommended by the platform. Gonzalez argues that Section 230 should not shield Google from liability because the platform aided in ISIS recruitment by recommending these videos to users. Google, on the other hand, contends that Section 230 shields it from liability for recommendations made on the platform, including the recommendations at issue in the case.
Our brief makes three principal arguments. First, it argues that Congress intended Section 230 to foster a free Internet where diverse and independent expression thrives. We explain that Section 230 was meant to facilitate free expression online, which is precisely what it continues to do.
Second, our brief argues that platform recommendations contribute to the flourishing of free expression, creativity, and innovation online. Authors like our members are served by platform recommendations and curation: for authors whose works may not appeal to a general audience, platform recommendations enable readers interested in a particular topic or type of work to discover them. In this way, platform recommendation can serve authors’ interests in seeing their works reach broad and diverse audiences. This is particularly important for authors just starting out in their careers who have not yet found an audience, and platform recommendations can and do help these authors grow their audiences.
Finally, we argue that altering Section 230’s protections for recommendations could have dire consequences for current and future creators, including authors, and could chill the free flow of ideas online. If platforms were held liable for content created by users, we believe they would take a more conservative approach, moderating content to avoid the threat of a lawsuit or other legal action. This could reasonably lead platforms to avoid hosting content on controversial topics or content by new and emerging creators whose views are unknown. An author’s ability to write freely, including on controversial topics, is essential for a vibrant democratic discourse. And if platforms were reluctant to recommend content by new creators, who may be seen as less “safe,” dominant and established creators could become further entrenched, to the detriment of those still building an audience. Were platforms to censor certain writings or ideas to avoid lawsuits, the internet would become less free, less vibrant, and more sanitized, doing a disservice to all of us.
Authors Alliance thanks Keker, Van Nest & Peters LLP for their invaluable support and contributions to this brief, as well as our fellow amici for sharing their stories.