Who should decide what content is permissible online? Platforms will always exercise some degree of discretion over content moderation, and ensuring that they exercise these discretionary powers responsibly is a large part of making governance legitimate. In this chapter, we argue that improving platforms' self-regulation of their internal governance practices is a critical component of any regulatory project. Platforms must always have a role in regulating lawful speech, and the regulation of ordinary, lawful speech is critical to influencing cultures and addressing harm. We draw on the results of a qualitative study with a broad group of participants who actively work to influence how platforms govern their users. We offer a simple argument: platforms bear moral responsibilities to address the pressing need for cultural change around violence against women, and these responsibilities cannot fully be carried out or overseen by states or other external actors.