Dominant social media platforms’ content moderation practices operate highly unequally, disproportionately censoring marginalised users while inadequately protecting them against hate speech and harassment. The EU’s main response to these problems has been the 2022 Digital Services Act (DSA), which aims to empower individuals to understand and contest moderation decisions. Analysing the DSA from a feminist perspective, I describe this approach in terms of ‘procedural fetishism’ and develop a three-level critique. First, the available evidence on how similar systems work in practice suggests that these provisions may have relatively little practical impact, especially for less-privileged user groups. Second, reviewing individual decisions cannot address the higher-level decisions and systemic biases that produce unreliable and discriminatory moderation. Moreover, the DSA allows platforms discretion over substantive policies, provided they are applied in a procedurally fair way, including policies that demonstrably disadvantage marginalised communities. Third, by diverting resources away from potentially more effective interventions, and by making platforms’ existing moderation systems appear more legitimate, the DSA’s fetishisation of procedure could actively exacerbate or reinforce unaccountable and unfair moderation. I conclude by identifying some elements of the DSA with the potential to enable more systemic reform of social media moderation, and thereby to address arbitrary censorship more effectively.