The Digital Services Act (DSA) establishes a Transparency Database (DSA-TDB) and requires platforms to submit Statements of Reasons (SoRs) explaining their moderation decisions. According to the DSA, the database is intended to serve three objectives: ensuring transparency, enabling scrutiny of content-moderation decisions, and monitoring the spread of illegal content. From a communications-policy perspective, this article evaluates the DSA-TDB’s ability to meet these objectives. We go beyond data-centric analyses and critically assess the database’s design, including its reporting schema, guidelines and data-access modalities, showing that these design choices impede the attainment of the stated objectives even before any data are analyzed. In addition, we run exploratory regressions on 3.52 billion SoRs, submitted by major social media platforms over a 20-month period, to assess whether the database can answer policy- and research-relevant questions. By analyzing restriction intensity, classification of illegality, and moderation speed, we illustrate the theoretical-analytical leverage the data could offer and how design and quality limitations constrain this potential. Mindful of data-quality constraints, we use these analyses to assess the database’s regulatory utility rather than to foreground substantive findings. Overall, while the DSA-TDB marks a step towards transparency, significant shortcomings remain: limited usability and accessibility, the absence of key data for scrutiny and for monitoring the spread of illegal content, and concerns about the consistency, reliability and validity of platform reporting. Consequently, the database falls short of its objectives. Alongside recommendations for improving the database, we argue that the regulatory objectives themselves need to be reviewed.
