No Chat Control


On May 11, 2022, the European Commission released a proposal for fighting child sexual abuse material (CSAM). I support efforts to prevent and combat child abuse online. However, eliminating the internationally recognized fundamental human right to privacy is not an acceptable byproduct.

Below is my feedback submitted to the European Commission during the public feedback period. I encourage European citizens to join me in submitting feedback through September 12, 2022. You are welcome to use my feedback as your own.

2022-10-23 update: The proposal is moving forward to the European Parliament and EU Council. Take action with over 100 human rights organizations opposing this proposal.

2023-11-14 update: Members of the European Parliament’s Civil Liberties committee voted against attempts from EU Home Affairs officials to require mass scanning of private and encrypted messages across Europe. It was a clear-cut vote, with a significant majority of MEPs supporting the proposed position. Problematic age verification systems remain in the draft and many details still require refinement. Read the responses from EDRi and MEP Patrick Breyer.

2024-03-22 update: MEP Patrick Breyer reported that EU governments have resumed work on the regulation and want to adopt it by June. He released a leaked proposal from the Belgian Council Presidency detailing the changes, which would still require mass surveillance to implement. Time to contact your MEPs again!

I oppose the current proposal to prevent and counter child sexual abuse material (CSAM) by requiring online service providers to detect and report material to public authorities.

1. The dangers of CSAM are real. The dangers of creating a surveillance state are also real. Dangers deserve proportional responses. The elimination of genuinely private communication for the vast majority of law-abiding citizens is not an acceptable or reasonable response to the dangers of CSAM.

2. The current proposal would require Facebook, Google, and other companies to become more invasive regarding privacy. Let’s examine the implementation details, because they matter greatly in this case.

Today, end-to-end encryption prevents service operators from inspecting the content of people’s communication. The proposed detection mandate means service operators will either have to remove end-to-end encryption to inspect the content of the communication on their servers or run content detection client-side on a person’s device, turning a person’s device into a government sensory organ.

There are no other ways of implementing this proposal. We cannot use magical words and wishes to achieve goals. No technical solutions currently exist that would allow providers to offer their users end-to-end encrypted services while still complying with their detection obligations under the proposal. The result would be an elimination of encryption entirely or offering a weakened version of encryption, which would create new vulnerabilities and put more people at risk.
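The mutual exclusivity described above can be sketched in a few lines of code. This is a toy illustration only, using a one-time pad as a stand-in for real end-to-end encryption (all names are illustrative, not any provider's actual implementation): the server relays ciphertext it cannot read, so any content detection must happen either by removing the encryption or by scanning on the endpoint before encryption.

```python
import secrets

def encrypt(message: bytes, key: bytes) -> bytes:
    """Toy one-time pad: XOR each message byte with a key byte.

    Stands in for end-to-end encryption; only the sender and the
    recipient hold the key, never the relaying server.
    """
    return bytes(m ^ k for m, k in zip(message, key))

# XOR is its own inverse, so decryption is the same operation.
decrypt = encrypt

# Sender and recipient share a key; the server never sees it.
message = b"private message"
key = secrets.token_bytes(len(message))

ciphertext = encrypt(message, key)

# The server only ever handles the ciphertext. With no key, it has
# no way to inspect the content -- which is why a detection mandate
# forces either key escrow / weakened encryption (server-side access)
# or scanning on the user's own device before encryption.
assert ciphertext != message
assert decrypt(ciphertext, key) == message
```

The point of the sketch: the server's inability to read the message is not an incidental property but the defining one, so "scan it, but keep it end-to-end encrypted" is a contradiction in terms.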

Regardless, the proposed detection system is a futile effort. The next evolution in private-messaging architecture is peer-to-peer, end-to-end encrypted communication that does not rely on a central service or closed-source software. Software using this architecture already exists but has not yet been widely adopted. Abusers who wish to propagate harmful content will simply switch to products with this peer-to-peer architecture, while the vast majority of citizens will have their privacy compromised with no benefit to children.

3. Corporations (the technology providers) should not be required to do the work of law enforcement in proactively detecting, preventing, and prosecuting crime.

Other opponents of the European Commission’s proposal:


Photo by Jeremy Bezanger on Unsplash