On May 11, 2022, the European Commission released a proposal for fighting child sexual abuse material (CSAM). I support efforts to prevent and combat child abuse online. However, eliminating the internationally recognized fundamental human right to privacy is not an acceptable byproduct.
Below is my feedback submitted to the European Commission during the public feedback period. I encourage European citizens to join me in submitting feedback through September 12, 2022. You are welcome to use my feedback as your own.
2022-10-23 update: The proposal is moving forward to the European Parliament and EU Council. Visit StopScanningMe.eu to take action with over 100 human rights organizations opposing this proposal.
2023-11-14 update: Members of the European Parliament’s Civil Liberties committee voted against attempts from EU Home Affairs officials to require mass scanning of private and encrypted messages across Europe. It was a clear-cut vote, with a significant majority of MEPs supporting the proposed position. Problematic age verification systems remain in the draft and many details still require refinement. Read the responses from EDRi and MEP Patrick Breyer.
I oppose the current proposal to prevent and counter child sexual abuse material (CSAM) by requiring online service providers to detect and report material to public authorities.
1. The dangers of CSAM are real. The dangers of creating a surveillance state are also real. Dangers deserve proportional responses. The elimination of genuinely private communication for the vast majority of law-abiding citizens is not an acceptable or reasonable response to the dangers of CSAM.
2. The current proposal would require Facebook, Google, and other companies to become more invasive regarding privacy. Let’s examine the implementation details, because they matter greatly in this case.
Today, end-to-end encryption prevents service operators from inspecting the content of people’s communication. The proposed detection mandate means service operators will either have to remove end-to-end encryption to inspect the content of the communication on their servers or run content detection client-side on a person’s device, turning a person’s device into a government sensory organ.
There are no other ways of implementing this proposal. We cannot use magical words and wishes to achieve goals. No technical solutions currently exist that would allow providers to offer their users end-to-end encrypted services while still complying with their detection obligations under the proposal. The result would be either eliminating encryption entirely or offering a weakened version of it, which would create new vulnerabilities and put more people at risk.
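To make the client-side scanning path concrete, here is a minimal sketch of the idea in Python. It assumes exact hash matching against a blocklist for simplicity; real proposals involve perceptual hashing and classifiers, and every name and hash value here is invented for illustration, not taken from any actual system:

```python
import hashlib

# Hypothetical blocklist a scanning authority could push to every device.
# The entry below is an invented placeholder, not real data.
BLOCKLIST = {
    hashlib.sha256(b"example-prohibited-content").hexdigest(),
}

def send_message(plaintext: bytes) -> str:
    """Sketch of client-side scanning: the device inspects the message
    BEFORE encryption, so end-to-end encryption no longer shields it."""
    digest = hashlib.sha256(plaintext).hexdigest()
    if digest in BLOCKLIST:
        return "reported"           # flagged and forwarded to authorities
    return "encrypted-and-sent"     # normal path; encryption happens after the scan

print(send_message(b"hello"))
print(send_message(b"example-prohibited-content"))
```

The key point the sketch illustrates is ordering: the scan must run on the plaintext before encryption, which is precisely why the device itself becomes the point of surveillance rather than the server.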
Regardless, the proposed detection system is a futile effort. The next technical architecture evolution for private messaging is peer-to-peer end-to-end encrypted communication that does not rely on a central service or closed-source software. Software that uses this technical architecture already exists, but has not yet been widely adopted. Abusers who wish to propagate harmful content will simply use products with this peer-to-peer architecture instead, while the vast majority of citizens will have their privacy compromised with no benefit to children.
3. Corporations (the technology providers) should not be required to do the work of law enforcement in proactively detecting, preventing, and prosecuting crime.
Other opponents of the European Commission’s proposal:
- CSA survivor and privacy advocate Alexander Hanff explains how intrusive Internet monitoring deprives survivors of safe spaces and disincentivises survivors from seeking help
- EDRi: European Commission’s online CSAM proposal fails to find right solutions to tackle child sexual abuse
- EFF: The EU’s new message-scanning regulation must be stopped, Part 2
- CDT: an appeal to EU lawmakers to urgently revise the approach
- DiEM25: EU Commission plans to end encryption and make Big Tech the gatekeeper
- Politico: Europe’s online child abuse law will make us all less safe
- Tutanota: EU Commission is planning what Apple stopped after backlash from privacy groups: Automatic CSAM scanning of your private communication.
- Privacy International: International Safe Abortion Day: no safety without privacy