Cybersecurity for Democracy advocates for comprehensive transparency for digital platforms, with practical and appropriate measures to preserve user privacy. The major elements of current transparency proposals are outlined below, following a timeline of key events.

Digital platform transparency: a timeline

JANUARY 2022. The Knight Institute publishes “A Safe Harbor for Platform Research,” a policy proposal that would establish legal protections for researchers and journalists who study Facebook and other platforms.

DECEMBER 2021. C4D and partners release “A Standard for Universal Ad Transparency,” published by the Knight First Amendment Institute, which describes in detail how universal ad transparency would work in practice.

DECEMBER 2021. C4D and Belgium’s KU Leuven expose Facebook’s poor performance in identifying political ads in “An Audit of Facebook’s Political Ad Policy Enforcement.”

SEPTEMBER 2021. C4D’s Laura Edelson calls for universal ad transparency in testimony before the U.S. House Science Committee and the UK Parliament.

AUGUST 2021. The FTC sends a letter to Facebook stating that Facebook’s claim that C4D’s research violates its consent decree is inaccurate.

AUGUST 2021. C4D publishes a New York Times op-ed about losing Facebook data access.

AUGUST 2021. Facebook makes good on the cease & desist by suspending C4D researchers’ accounts, which sets off a firestorm of media coverage around the globe.

OCTOBER 2020. Facebook responds to Ad Observer by sending C4D a cease & desist letter demanding that the tool be disabled, citing violations of Facebook’s terms of service.

SEPTEMBER 2020. C4D launches Ad Observatory, a platform that analyzes data from Ad Observer, Facebook Ad Library, and other sources, and shares trends with the public.

JULY 2020. C4D launches Ad Observer, which allows Facebook users to safely share data with C4D about the ads they see.

MAY 2018. Facebook launches Ad Library, which offers access to limited data about political ads that run on the platform.

Universal digital advertising transparency: Require platforms to disclose comprehensive data about digital advertising, including who is paying, how much, targeting criteria, impressions, and reach. This data must be made publicly available, without restrictions.

Safe harbor: Establish a “safe harbor,” or legal protections, for researchers and journalists to investigate the operations of platforms as long as they handle data responsibly and adhere to professional and ethical standards.

High-engagement content: Mandate transparency around “reasonably public,” high-engagement content.

Clean room protections: Provide vetted researchers access to sensitive data through “clean rooms,” or controlled environments.

Algorithmic transparency: Call for platforms to provide transparency around their algorithmic processes and reasoning.