Stop Scanning Me


On May 11, 2022, the European Commission released a proposal for fighting child sexual abuse material (CSAM). I support efforts to prevent and combat child abuse online. However, eliminating the internationally recognized fundamental human right to privacy is not an acceptable byproduct.

Below is my feedback submitted to the European Commission during the public feedback period. I encourage European citizens to join me in submitting feedback through September 12, 2022. You are welcome to use my feedback as your own.

2022-10-23 update: The proposal is moving forward to the European Parliament and EU Council. Visit StopScanningMe.eu to take action with over 100 human rights organizations opposing this proposal.


I oppose the current proposal to prevent and counter child sexual abuse material (CSAM) by requiring online service providers to detect and report material to public authorities.

1. The dangers of CSAM are real. The dangers of creating a surveillance state are also real. Dangers deserve proportional responses. The elimination of genuinely private communication for the vast majority of law-abiding citizens is not an acceptable or reasonable response to the dangers of CSAM.

2. The current proposal would require Facebook, Google, and other companies to become more invasive of people’s privacy. Let’s examine the implementation details, because they matter greatly in this case.

Today, end-to-end encryption prevents service operators from inspecting the content of people’s communication. The proposed detection mandate means service operators will either have to remove end-to-end encryption to inspect the content of the communication on their servers or run content detection client-side on a person’s device, turning a person’s device into a government sensory organ.
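
To make the distinction concrete, here is a minimal sketch of end-to-end encryption using the PyNaCl library (my own choice for illustration; the proposal does not name any particular technology). The service provider only ever relays ciphertext it cannot read, which is why complying with a detection mandate requires either removing this encryption or scanning content on the person’s device before it is encrypted.

```python
# Minimal sketch of end-to-end encryption (PyNaCl, chosen only for illustration).
# The "server" only ever handles ciphertext, so it cannot inspect content.
from nacl.public import PrivateKey, Box

# Each person generates a key pair on their own device.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts on her device using Bob's public key.
sending_box = Box(alice_key, bob_key.public_key)
ciphertext = sending_box.encrypt(b"A genuinely private message")

# The service provider relays this opaque blob; it holds no key
# and therefore cannot read or scan the content it carries.
relayed_by_server = ciphertext

# Only Bob's device, holding his private key, can decrypt.
receiving_box = Box(bob_key, alice_key.public_key)
plaintext = receiving_box.decrypt(relayed_by_server)
assert plaintext == b"A genuinely private message"
```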

There are no other ways of implementing this proposal. We cannot use magical words and wishes to achieve goals. No technical solutions currently exist that would allow providers to offer their users end-to-end encrypted services while still complying with their detection obligations under the proposal. The result would be either the elimination of encryption entirely or a weakened version of encryption, which would create new vulnerabilities and put more people at risk.

Regardless, the proposed detection system is a futile effort. The next evolution in technical architecture for private messaging is peer-to-peer, end-to-end encrypted communication that does not rely on a central service or closed-source software. Software that uses this architecture already exists but has not yet been widely adopted. Abusers who wish to propagate harmful content will simply use products with this peer-to-peer architecture instead, while the vast majority of citizens will have their privacy compromised with no benefit to children.

3. Corporations (the technology providers) should not be required to do the work of law enforcement in proactively detecting, preventing, and prosecuting crime.


