Author granted license

Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International

Document Type

Article

Publication Date

2-2018

Publisher

Yale Law School Information Society Project

Language

en-US

Abstract

On February 13, 2018, WIII hosted the workshop, “Beyond Intermediary Liability: The Future of Information Platforms.” Leading experts from industry, civil society, and academia convened at Yale Law School for a series of non-public, guided discussions. The roundtable of experts considered pressing questions related to intermediary liability and the rights, roles, and responsibilities of information platforms in society. Based on conversations from the workshop, WIII published a free, publicly available report detailing the most critical issues necessary for understanding the role of information platforms, such as Facebook and Google, in law and society today. The report highlights insights and questions raised by experts during the event, providing an insider’s view of the top issues that influential thinkers on intermediary liability are considering in law, policy, and ethics. (Nothing in the report necessarily reflects the individual opinions of participants or their affiliated institutions.)

Key takeaways from this report include the following:

Common Misconceptions on Intermediary Liability

Consumers and policymakers often assume, incorrectly, that it is easy to determine which content to take down and how to do so efficiently. In reality, these decisions are very difficult and require many layers of human (not AI) review. Policymakers and the public also often assume, incorrectly, that information intermediaries are legally required to be “neutral”; no such legal requirement exists.

Intermediaries and Global Norms

Information intermediaries play a vital role in protecting free speech, free expression, and access to knowledge globally. This is especially crucial for minorities and political dissidents living under authoritarian regimes. It is difficult, and at times impossible, for information intermediaries to comply with conflicting laws from different countries. This can be a barrier to innovation, disproportionately affecting smaller companies and startups. Policymakers should consider the impact that proposed regulations in one jurisdiction may have on people in the rest of the world. Regulations in democratic countries that restrict free online speech or that mandate content takedowns may provide support for illiberal regimes to call for greater censorship of online content.

Legal and Policy Proposals

Information intermediaries are no longer the companies they were when intermediary liability laws first developed, and the role of platforms in society is changing. The law must find a way to address these changes flexibly. A hybrid model of governance, with a larger role for lawmakers and an opportunity for judicial review and a right of reply in content takedown decisions, might better address the competing issues raised in speech regulation. Creating a transparency safe harbor would allow companies to provide more information to the public about their reasons for removing content. Policymakers could also consider enacting different levels of regulation for different types of information intermediaries (infrastructure vs. content platforms, small companies vs. large companies, and so on).
