All major social media platforms use content moderation to prevent harmful content from spreading on their systems. To quantify the impact of content moderation, we propose the metric of prevented dissemination. To understand the practical limitations that content moderation systems face, we conducted an empirical measurement study of public posts from news providers on Facebook in English, Ukrainian, and Russian. We analyzed how quickly posts accrue engagement, finding large asymmetries in engagement across content and over time, and used our measurements to build a model that predicts a post’s future engagement. We also observed the timing of (rare) post removals. Using our prevented dissemination metric, we estimate that removals prevented only 24–30% of the removed posts’ predicted engagement. Our lens of prevented dissemination provides an outcome-based metric for judging the impact of content moderation in practice and could help builders of moderation systems prioritize content for review.
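To make the metric concrete, a minimal sketch of the idea follows. This is an illustration, not the paper’s implementation: the function name, the inputs (a post’s predicted total engagement and the engagement it had already accrued at removal time), and the example numbers are all hypothetical. It simply computes the share of predicted engagement that a removal would avert.

```python
def prevented_dissemination(predicted_total: float, engagement_at_removal: float) -> float:
    """Illustrative sketch: fraction of a post's predicted total engagement
    averted by removing it after it has already accrued some engagement.
    Inputs and name are hypothetical, not the paper's implementation."""
    if predicted_total <= 0:
        return 0.0
    # Engagement the post would still have accrued, as a share of its
    # predicted total, is the dissemination the removal prevented.
    remaining = max(predicted_total - engagement_at_removal, 0.0)
    return remaining / predicted_total

# Hypothetical example: a post predicted to reach 1,000 engagements,
# removed after accruing 730, has 27% of its engagement prevented.
print(prevented_dissemination(1000, 730))
```

A late removal of a fast-accruing post prevents little of its predicted engagement, which is why the abstract’s 24–30% figure can be low even when removals do occur.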