Content Moderation in an Age of Extremes

Tushnet, Rebecca
Journal of Law, Technology, and the Internet

While some people argue for increased government intervention in the form of legal duties to remove various types of unwanted content, others maintain that the best solution is to reconstruct some sort of democratic process within a service’s “polity” itself: a procedural solution to knotty problems of substance. I want to complicate the debate by discussing the multiple types of actors in the intermediary space. Some entities, like the OTW, don’t resemble the profit-seeking model at which most regulatory and governance proposals are directed. Others, such as participants in the domain name system, have very different functions and abilities than the websites and apps most people think of as “the internet.” If we don’t keep these variances in mind, we are unlikely to get the results we seek. It’s very hard to generalize beyond those cautions because both content moderation policies and government action are changing so quickly, as illustrated by the recent preliminary approval of a copyright filtering requirement for intermediaries in Europe[12] and the even more recent “embedding” of French officials into Facebook to see how it regulates hate speech.[13]