This document is a revised and updated version of the technical document Platforms’ policies on climate change misinformation, first published in 2023. Back then, we mapped how five major platforms (Facebook, Instagram, YouTube, TikTok, and X/Twitter) addressed climate change disinformation through their policies and enforcement systems.
At that time, the Digital Services Act (DSA) had not yet entered into force. Therefore, platforms had no legal obligation to address climate disinformation in the EU, and each one did so in its own way, with varying levels of ambition and effectiveness.
The DSA is now in force, but climate disinformation is not explicitly recognised as a “systemic risk” under Articles 34–35. This omission limits its inclusion in platform risk assessments, mitigation efforts, and transparency reports, and leaves enforcement largely discretionary. Without specific guidance or mandates, platforms retain wide latitude in deciding whether and how to address climate harms.
With this regulatory gap in mind, we set out to examine how platform responses to climate disinformation evolved, or failed to evolve, between 2023 and 2025, and what their policies look like in practice across Facebook, Instagram, YouTube, TikTok, X, and, newly included in this edition, LinkedIn. This update therefore aims to:
- Refresh memory by documenting what actions were in place in 2023.
- Measure progress or regression, both in public commitments and enforcement practices.
- Support renewed pressure on platforms to address climate disinformation more seriously.
- Encourage EU regulators to explicitly recognise climate disinformation as a systemic risk under the DSA, and ensure future guidance, risk mitigation requirements, and platform transparency reflect that urgency.
As the climate crisis accelerates, it is crucial to demand that very large online platforms (VLOPs) take meaningful, measurable action to reduce the spread and amplification of harmful climate narratives, whether through misleading organic content, monetised falsehoods, algorithmic echo chambers, or paid advertisements.