While some people thrive in the land of TikTok dances, others struggle to limit their thoughts to 140 characters, a struggle that led Twitter to raise its character limit to 280 in 2017. In fact, as of February 2019, 57 percent of Internet users believed that social media platforms had increased access to information and made communication easier.

Surely, there is no shortage of online content.

The proliferation of new applications and technologies has also meant the dissemination of ever more diverse information. To date, Section 230 of the Communications Decency Act of 1996 has largely sheltered platforms that host users’ messages, like Facebook, Twitter, and YouTube, from legal liability. Section 230 provides that “interactive computer services” cannot be deemed the publisher of, and held liable for, third-party content. But a case currently pending before the United States Supreme Court, Reynaldo Gonzalez v. Google, may change that.

In 2015, Nohemi Gonzalez, an American college student studying in France, was killed in Paris in an attack carried out by the international terrorist group ISIS. The day after the attack, ISIS claimed responsibility in a written statement and a YouTube video.

Eight years later, her family found themselves before the Supreme Court as the nine Justices wrestled with a lawsuit testing the intersection of technology companies and the decades-old policies that govern them.

The Gonzalez family alleged that YouTube—owned by Google—is legally responsible for the Paris attacks through its algorithm, which serves recommendations into users’ feeds in the form of video thumbnails. YouTube’s algorithm, the family alleged, "recommended that users view inflammatory videos created by ISIS, videos which played a key role in recruiting fighters to join ISIS in its subjugation of a large area of the Middle East, and to commit terrorist acts in their home countries." Gonzalez’s counsel argued the site should be held responsible for the content of those videos because it acted as a recruiting platform for the terrorist group.

In contrast, Google argued that terrorist content is banned on the platform and that it regularly enforces policies preventing terrorists from using its services. Counsel for Google also argued that Section 230 generally immunizes technology and social media companies from liability for what their users share on their platforms. Google contended such immunity is essential to tech companies’ ability to provide useful content to their users and to customize the user experience.

Does Section 230 draw a legal distinction between a company hosting user content and a company amplifying that content through the algorithmic recommendations its website provides? That question is open to debate.

Four principles serve as justification for protecting speech: to further self-governance; to aid in the discovery of truth and maintain the marketplace of ideas; to promote autonomy and protect self-expression; and to encourage tolerance. These principles have real-world effects.

A strong pillar of the First Amendment is the “marketplace of ideas.” Justice Oliver Wendell Holmes endorsed the idea that “the ultimate good desired is better reached by free trade in ideas”: the government may not censor false or harmful speech, because its judgment might be wrong; instead, citizens weigh the ideas in a competitive market.

Justice Holmes’ vision appears to rest on a simple rationale: good ideas win, and the better argument persuades. Even if the First Amendment is viewed as an important symbol of our country, it is a highly labile one. Many criticize the marketplace of ideas as ineffective because the theory ignores that, for the marketplace to work, citizens need interactions that develop a sense of trust and build habits of cooperation.

Indeed, Oxford Dictionaries selected “post-truth” as the 2016 word of the year, defining it as “denoting circumstances in which objective facts are less influential in shaping public opinion than appeals to emotion and personal belief.” A study that analyzed the interactions of over 376 million Facebook users found that people tend to seek out only information that aligns with their views. Perhaps that is exactly what YouTube’s algorithm provides.

For over two hours, the Court grappled with the prospect of exposing companies to a barrage of lawsuits over their handling of user content. “We really don’t know about these things. You know, these are not like the nine greatest experts on the internet,” Justice Elena Kagan said of herself and her colleagues, several of whom smiled in agreement. Justice Brett Kavanaugh similarly suggested that Congress, not the Court, should make any needed changes to a law passed early in the internet age, asking, “isn’t it better to put the burden on Congress to change that?”

We must appreciate Justice Holmes’ wisdom, but also acknowledge a perspective that is often left out: the cultural, intellectual, and political combat that free speech produces. The marketplace of ideas can be unpredictable and, in certain situations, even impossible to domesticate. Gonzalez v. Google highlights how such situations bring important questions to the fore, including: Does the First Amendment continue to preserve the wholesale protection of speech, or is there a point at which we are willing to limit avenues of speech and accept justifications for regulation?

Perhaps regulation starts with Section 230 itself. However, the Court appeared cautious about how to sensibly narrow the protections Section 230 affords. Among the Court’s concerns was how to avoid sweeping consequences for internet users and the onslaught of litigation that would surely result. "You are creating a world of lawsuits," Justice Kagan stated. "Really, anytime you have content, you also have these presentational and prioritization choices that can be subject to suit."

The Supreme Court is expected to release a decision in the case by the end of June.
