
Child Safety on Federated Social Media

Authors: Thiel, David; DiResta, Renée
Year: 2023

Since becoming popular in the mid-2000s, the largest social media companies have operated largely in a centralized way: the entire network is controlled by a single central authority, and all user data is stored and managed on its servers. This is how Facebook, Twitter, and YouTube, for example, operate. However, due to user dissatisfaction and shifting social norms, federated social media, a decentralized approach to social networking in which multiple interconnected servers (called instances) are owned and operated independently by different organizations or individuals, has recently experienced a surge in popularity. Projects such as Mastodon, Bluesky, Pleroma, and Lemmy offer new possibilities for a more resilient, protocol-based social media ecosystem not bound to a single company or entity. Users can create accounts on any instance they choose, and they have the freedom to follow or interact with users on other instances within the federation.

While decentralization can help distribute load among multiple entities, decentralized platforms pose new challenges for trust and safety. There is no central moderation team, for example, tasked with removing imagery of violence or self-harm, child abuse, hate speech, terrorist propaganda, or disinformation. Each instance in a federated social media network may have its own set of rules and policies, with administrators moderating content and enforcing guidelines specific to their instance. This is considered a strong selling point, as users can find an instance that aligns with their values and tolerance levels for particular types of content. At a time when the intersection of moderation and free speech is a fraught topic, decentralized social networks have gained significant attention and many millions of new users.
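To make the federation model concrete, the sketch below shows the WebFinger lookup and ActivityPub actor fetch that Fediverse servers such as Mastodon perform when resolving an account on another instance. The instance example.social and the account alice are hypothetical placeholders; this is an illustrative sketch of the discovery flow, not code from this report.

```python
# Minimal sketch of Fediverse account discovery via WebFinger and ActivityPub.
# The instance "example.social" and account "alice" are hypothetical.
import requests

INSTANCE = "example.social"
ACCOUNT = f"alice@{INSTANCE}"

# Step 1: WebFinger lookup to find the account's ActivityPub actor URL.
webfinger = requests.get(
    f"https://{INSTANCE}/.well-known/webfinger",
    params={"resource": f"acct:{ACCOUNT}"},
    timeout=10,
).json()

actor_url = next(
    link["href"]
    for link in webfinger["links"]
    if link.get("rel") == "self" and link.get("type") == "application/activity+json"
)

# Step 2: Fetch the actor document, which lists the account's inbox and outbox,
# the endpoints other instances use to deliver and read activities.
actor = requests.get(
    actor_url, headers={"Accept": "application/activity+json"}, timeout=10
).json()

print(actor["inbox"], actor["outbox"])
```

Because any independently operated server can implement this same discovery and delivery flow, content and follows cross instance boundaries freely, which is also why moderation decisions made on one instance do not automatically propagate to the rest of the network.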

However, decentralization makes significant harm categories, and child safety in particular, far harder to address because of the regulatory arbitrage of content moderation: bad actors tend to gravitate toward the platforms with the most lax moderation and enforcement policies. This means that decentralized networks, in which some instances have limited resources or choose not to act, may struggle to detect or mitigate Child Sexual Abuse Material (CSAM). Federation currently results in redundancies and inefficiencies that make it difficult to stem CSAM, Non-Consensual Intimate Imagery (NCII), and other noxious and illegal content and behavior. In this paper we take a broad look at child sexual exploitation concerns on decentralized social media, present new findings on the nature and prevalence of child safety issues on the Fediverse, and offer several proposals to improve the ecosystem in a sustainable manner. We focus primarily on the Fediverse (i.e., the ecosystem supporting the ActivityPub protocol) and Mastodon, but several techniques could also be repurposed for decentralized networks such as Nostr, or semi-centralized networks such as Bluesky.
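As context for the detection gap described above, the sketch below illustrates the general shape of server-side hash matching of uploaded media against known-bad hashes, the kind of tooling centralized platforms typically operate and many independently run instances lack. It is an illustrative assumption rather than tooling described in this report: production systems match against vetted hash lists using industry algorithms such as PhotoDNA or PDQ, whereas this sketch substitutes the open-source imagehash perceptual hash, a hypothetical set of known hashes, and an arbitrary distance threshold.

```python
# Illustrative sketch only: real deployments match against vetted hash lists
# using industry algorithms (e.g., PhotoDNA, PDQ) that are not freely
# distributed. The open-source `imagehash` perceptual hash stands in here.
import imagehash
from PIL import Image

# Hypothetical set of known-bad perceptual hashes (64-bit pHashes as hex).
KNOWN_BAD_HASHES = {imagehash.hex_to_hash("d1d1b4b4f0f0c0c0")}

# Maximum Hamming distance treated as a near-duplicate match (arbitrary here).
HAMMING_THRESHOLD = 6

def is_flagged(image_path: str) -> bool:
    """Return True if the image is a near-duplicate of any known-bad hash."""
    h = imagehash.phash(Image.open(image_path))
    # ImageHash subtraction returns the Hamming distance between two hashes.
    return any(h - known <= HAMMING_THRESHOLD for known in KNOWN_BAD_HASHES)

if __name__ == "__main__":
    print(is_flagged("uploaded_media.jpg"))
```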