CCDH’s STAR Framework is a blueprint for policymakers to combat online harms and fortify democracy in the digital age.
Safety by Design is a principled approach to the design of technology products and social media platforms that promotes user health, well-being, human rights, and civil liberties. Safety cannot be achieved without transparency into the algorithmic systems and economic incentives driving platform features and behaviors.
Reorient the product design process and empower users.
Create robust systems for tackling online harms and protecting minors online.
Transparency is a social media company’s obligation to disclose accurate and accessible information about algorithms, product design, platform decisions, and economics, particularly around advertising.
Mandate that platforms publish standardized transparency reports and comprehensive content libraries.
Establish an independent digital regulator that can enforce transparency, conduct audits, and require disclosures about product design.
Create a research program to study online harms, certify and protect independent researchers, and protect data privacy.
Accountability and Responsibility underscore that platforms must take ownership of their decisions and be responsive to users and democratic institutions. Governments must implement economic consequences for inaction to counterbalance the profit motives that lead companies to deprioritize safety and transparency.
Reform Section 230 to address platform behavior, not user speech.
Impose consequences for harmful content that platforms algorithmically amplify or monetize, and limit liability protections for illegal content.
Establish an independent digital regulator dedicated to online platforms, empowered to conduct robust investigations and fine platforms that fail to comply with the law.