Policy

Cybersecurity for Democracy advocates for comprehensive transparency for digital platforms, paired with practical, appropriate measures to preserve user privacy. The major elements of current transparency proposals are outlined below.

Digital platform transparency: a timeline


JANUARY 2022. The Knight Institute publishes “A Safe Harbor for Platform Research,” a policy proposal that would establish legal protections for researchers and journalists who study Facebook and other platforms.

DECEMBER 2021. C4D and partners release “A Standard for Universal Ad Transparency,” published by the Knight First Amendment Institute, which describes in detail how universal ad transparency would work.

DECEMBER 2021. C4D and Belgium’s KU Leuven expose Facebook’s poor performance in identifying political ads in “An Audit of Facebook’s Political Ad Policy Enforcement.”

SEPTEMBER 2021. C4D’s Laura Edelson calls for universal ad transparency in her testimony before the U.S. House Science Committee and the UK Parliament.

AUGUST 2021. The FTC sends Facebook a letter stating that the company’s claims that C4D’s research violates its consent decree are inaccurate.

AUGUST 2021. C4D publishes a New York Times op-ed about losing Facebook data access.

AUGUST 2021. Facebook makes good on its October 2020 cease & desist letter by suspending C4D researchers’ accounts, setting off a firestorm of media coverage around the globe.

OCTOBER 2020. Facebook responds to the launch of Ad Observer by sending C4D a cease & desist letter demanding that the tool be disabled, citing violations of its terms of service.

SEPTEMBER 2020. C4D launches Ad Observatory, a platform that analyzes data from Ad Observer, Facebook Ad Library, and other sources, and shares trends with the public.

JULY 2020. C4D launches Ad Observer, which allows Facebook users to safely share data with C4D about the ads they see.

MAY 2018. Facebook launches Ad Library, which offers access to limited data about political ads that run on the platform.

Platform Transparency

  • Universal Digital Advertising Transparency: Require platforms to provide comprehensive digital advertising transparency. Data should include who is paying, how much, certain targeting information, impressions, and reach, and must be made available publicly, without restrictions.
  • High-Engagement Content: Mandate transparency around “reasonably public,” high-engagement content.
  • Algorithmic Transparency: Call for platforms to provide transparency around their algorithmic processes and reasoning.

Reform Liability Immunity for Tech Companies

  • Update Section 230: More clearly define what constitutes “third-party speech” that should be covered, versus platform behavior (such as algorithmic amplification, recommender systems, paid/targeted content, and auto-generation of content) that may fall beyond the scope of Section 230.
  • Empirical Evidence: Provide empirical evidence showing how these various platform tools work, how liability might be applied, and what enforcement could look like.

Data Access for Researchers / Right to Research

  • Legal Protection for Researchers: Find a basis in existing law to protect researchers challenging tech power; where adequate law does not exist, work toward developing new law that better protects researchers. Approaches may include establishing a “safe harbor” for researchers and journalists to investigate the operations of platforms, as long as they handle data responsibly and adhere to professional and ethical standards.
  • Clean Room Protections: Provide vetted researchers with access to sensitive data through “clean rooms,” i.e., controlled environments.
  • Public Data Access: Work with regulators in Europe and the U.S. toward establishing more data sets that do not require vetted access.