Dispatches from the future: The first year of Chat Control

Disclaimer

This thought exercise imagines the consequences of the European Union passing a proposal to scan all photos and videos sent by people for potential child abuse.

The proposal is real, but this article is extrapolative fiction.

The quotations are based on real statements made by the attributed people, but are presented outside their original contexts, which are linked.

The futuristic apps are made up, but based on ideas actively being explored.

Day 1

Reuters logo

EU law begins forcing Big Tech to tackle child abuse imagery, activist takes aim

Brussels — The Child Sexual Abuse Regulation went into effect today in the European Union. The law is seen as the EU enlisting an unexpected partner: the big tech companies it previously fined for not being private enough.

Industry adoption of end-to-end encrypted messaging technology allowed businesses and governments to communicate digitally with less concern about data leaks. That security came with a tradeoff: increased difficulty for law enforcement investigating potential crimes.

End-to-end encryption works in part by the sender and recipient exchanging unique keys to decrypt each other’s messages. Introducing an additional decryption key for a third party is technically possible and could allow law enforcement to read intercepted messages. However, the difficulty of keeping such additional keys out of adversaries’ hands has prevented the technique from being used.
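The key-exchange idea described above can be sketched with a toy Diffie-Hellman exchange. This is an illustration only: the parameters are deliberately simple, and real messengers use vetted elliptic curves and ratcheting protocols, not this code.

```python
# Toy Diffie-Hellman key agreement: how two people can derive a shared
# encryption key while exchanging only public values.
# Parameters are illustrative only; real apps use vetted elliptic curves.
import hashlib
import secrets

P = 2**127 - 1  # a Mersenne prime, far too small for real security
G = 3           # public generator

def keypair():
    private = secrets.randbelow(P - 2) + 2  # secret, never transmitted
    public = pow(G, private, P)             # safe to send over the wire
    return private, public

# Sender and recipient each generate a keypair and swap only the public halves.
a_private, a_public = keypair()
b_private, b_public = keypair()

# Both sides compute the same shared secret without ever sending it.
a_shared = pow(b_public, a_private, P)
b_shared = pow(a_public, b_private, P)
assert a_shared == b_shared

# A symmetric message key is derived from the shared secret.
message_key = hashlib.sha256(str(a_shared).encode()).digest()
```

A “third-party decryption key” for law enforcement would mean running this derivation a second time against an escrowed key; protecting that extra private key from adversaries is the unsolved part the paragraph above refers to.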

EU legislators’ opposition to the proposal relaxed when a different technical implementation was identified. The regulation requires messaging apps to verify that photos and videos do not contain sexually explicit imagery involving a child before they are sent encrypted. Apps can temporarily upload the unencrypted media to their moderation servers or use on-device machine learning models (AI) trained on illegal content previously obtained by law enforcement. Messaging apps must submit newly identified images to law enforcement.

Over 100 human rights organizations opposed the “chat control” proposal before it became law. Critics vowed to take the fight to the European Court of Justice for violating the right to privacy guaranteed by the Charter of Fundamental Rights of the European Union.

“In the general scanning of communications, frequent false positives cannot be avoided, even if accuracy rates are high, thereby implicating numerous innocent individuals. Given the possibility of such impacts, indiscriminate surveillance is likely to have a significant chilling effect on free expression and association, with people limiting the ways they communicate and interact with others and engaging in self-censorship,” said the UN Office of the High Commissioner for Human Rights.

“This regulation treats every photo we share with friends as potential evidence of our guilt until we have proven our innocence beyond a reasonable doubt. Today, the state surveillance of our photos is justified by child safety. Tomorrow, the scanning could expand to everything we do on our devices justified by terrorism or organized crime,” said activist and software developer Jeremiah Lee.

This story is fictional. It was never released by Reuters. See disclaimer.

Signal logo

Press Release

Signal must leave the EU to remain secure

Signal logo faded over a map of the EU composed of EU yellow dots on EU blue background

San Francisco — With deep regret, the Signal Foundation today discontinued service of the Signal messaging app within European Union member states.

Signal warned EU lawmakers that forcing it to undermine its privacy guarantees would result in it leaving the EU market.

“There is no way to implement the EU Child Sexual Abuse Regulation in the context of end-to-end encrypted communications without fundamentally undermining encryption and creating a dangerous vulnerability in core infrastructure that would have global implications well beyond Europe,” said Meredith Whittaker, Signal President.

User accounts registered with phone numbers with country codes within the EU can no longer send or receive messages using Signal. The app is no longer available for download from the Apple App Store and Google Play Store for people with billing addresses within the EU. Existing installs will continue to provide access to previously sent messages but will receive no further updates.

This press release is fictional. It was never released by Signal. See disclaimer.

Reuters logo

“Communication chaos” for government officials as EU chat control law goes live

Brussels — Midnight negotiations led by the European Commission failed to convince the Signal Foundation to continue providing service to members of European governments, security agencies, and militaries exempted from the chat control scanning regulation.

The Commission recommended Signal in 2020 for communication between staff and people outside the institution after thousands of diplomatic cables were accessed illegally. It is now evaluating a fork of the open source project, as well as products using the Matrix protocol, but does not expect to have a new recommendation for at least six months.

This story is fictional. It was never released by Reuters. See disclaimer.

Apple logo

Press Release

Apple announces powerful new safety features

Cupertino, California — Apple today introduced new child safety features in the Messages app. On-device machine learning will detect, report, and remove content that depicts sexually explicit activities involving a child. Apple pioneered this capability in 2021 and is now able to bring the innovation to market with support from governments willing to protect children.

Apple’s Communication Safety feature and Sensitive Content Analysis framework are designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device detection built from a database of known illegal images provided by the National Center for Missing and Exploited Children (NCMEC) and other child safety organizations. This enables Apple to report detected matches to law enforcement agencies.

We will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy and make the Internet a safer place for children and for us all.

This press release is fictional. It was never released by Apple. See disclaimer.

Meta logo

Newsroom

  • WhatsApp
  • Messenger
  • Instagram

Enabling Safer Chat for Everyone

Grayscale illustration of a photo that had been blurred, with a blue shield in front of it

Menlo Park — Meta continues its investment in public safety with industry-leading AI to accurately identify illegal content within the EU and report violations to law enforcement as required.

Keeping young people safe online has been a challenge since the Internet began. As criminals evolve their tactics, we have to evolve our defenses too.

We’ve trained AI using millions of nuanced moderation decisions made by humans. New regulation allows us to extend this protection to the images and videos sent encrypted between people.

The spread of Child Sexual Abuse Material (CSAM) challenges all communication platforms and cannot be solved by a single company. “Meta welcomes a more active role by governments and regulators in updating the rules for the Internet, so we can preserve what’s best about it—the freedom for people to express themselves and for entrepreneurs to build new things—while also protecting society from broader harms,” Mark Zuckerberg, Meta CEO, said.

WhatsApp connects hundreds of millions of people in the European Union and billions globally. People do amazing things with our platforms and we are committed to ensuring a positive experience for everyone.

Meta has invested more than $20 billion since 2016 in trust and safety initiatives. “Meta goes above and beyond to make sure that there are no portions of their network where this type of activity occurs,” said the National Center for Missing and Exploited Children.

This press release is fictional. It was never released by Meta. See disclaimer.

Day 83

Distopični Časi

Police clear local mother of pedophilia allegation

Ljubljana (Reuters) — A first-time mother claims the recently enacted Child Sexual Abuse Regulation caused irreparable damage to her reputation after WhatsApp flagged photos she sent.

Police raided Melody Bostic’s home in Ljubljana weeks after she sent photos of her newborn’s diaper rash to her mother. Bostic, 21, sought advice on the urgency of the health problem.

“I constantly worry about doing the wrong thing for my baby. I can’t take her to the doctor every time I suspect something might be wrong. I never imagined police would show up to my home, take my phone, and not leave until I gave them my passcode to search through all my photos,” said Bostic.

A spokesperson for the law enforcement agency confirmed that the search of Bostic’s home occurred and that no charges were filed. The consequences of the investigation linger for Bostic. “Even though I was proven innocent, my neighbors still avoid me. They saw the flashing lights of the police cars in front of my home for hours.”

Bostic’s mother said, “Once you’re accused of something horrible, it’s hard for people to not be suspicious of you. The Ten Commandments told us to not bear false witness against our neighbors, but what about apps? They shouldn’t bear false witness either. My daughter is being punished for their lie. How is that fair?”

Meta declined to comment, citing a policy of not discussing individual user situations.

This story is fictional. It was never released by Reuters. See disclaimer.

Day 203

Wired logo
The Big Story

Post-privacy era?
Not on these geeks’ watch.

In the wake of the EU’s reversal on privacy, third-party tools have restored security features forcibly removed from their favorite messaging apps, while radical new apps aspire to be regulation-proof.

Third-party WhatsApp clients are not new, but their popularity is. The developers of WasÄpp estimate a quarter of European WhatsApp users now use it. (Was is German for what.) WasÄpp looks and functions nearly identically to WhatsApp. Its popularity comes from the one feature it lacks and the one feature it adds.

The EU’s Child Sexual Abuse Regulation required messaging apps to review all images and videos sent using their services for illegal content. WasÄpp gets around the regulation by not providing service—it’s just an app that uses WhatsApp’s service—and by sharing media only through text. Lawmakers exempted text from review in what was heralded as a key privacy concession.

Instead of attaching media to a message, WasÄpp encrypts the media and uploads it to the sender’s Google Drive, Dropbox, or other cloud storage service provider. WasÄpp then sends a link to the file and the decryption key as text to the recipient. The recipient’s WasÄpp identifies the share link in the text message and automatically downloads and decrypts the image.

Once set up, users don’t notice the automation WasÄpp performs behind the scenes. The user experience is exactly like using WhatsApp before the chat control regulation—and that’s the point. Unless the EU outlaws encrypting your own data, there isn’t much it could do to stop this workaround.
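The link-and-key workaround described above can be sketched in a few lines. WasÄpp is fictional and its internals are unknown; the XOR keystream below is a toy stand-in for a real cipher, and the upload URL is hypothetical.

```python
# Toy sketch of the encrypt-upload-and-text-a-link workaround.
# The XOR keystream is illustrative only; a real app would use an
# authenticated cipher such as AES-GCM or XChaCha20-Poly1305.
import base64
import hashlib
import secrets

def keystream(key: bytes, length: int) -> bytes:
    """Derive a pseudo-random byte stream from the key (toy counter mode)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(data: bytes, key: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

decrypt = encrypt  # XOR stream ciphers are their own inverse

# Sender: encrypt the photo, upload the ciphertext anywhere, text the link + key.
photo = b"...jpeg bytes..."
key = secrets.token_bytes(32)
ciphertext = encrypt(photo, key)
upload_url = "https://drive.example/abc123"  # hypothetical storage location
text_message = f"{upload_url}#{base64.urlsafe_b64encode(key).decode()}"

# Recipient: parse the text, fetch the ciphertext, decrypt locally.
url, encoded_key = text_message.split("#")
recovered = decrypt(ciphertext, base64.urlsafe_b64decode(encoded_key))
assert recovered == photo
```

Because only the text message travels through WhatsApp, the scanner never sees the media, which is the loophole the article describes.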

A new class of apps is doing more than responding defensively. Being ungovernable is an explicit design goal for ValidChat, Wallow Messenger, and Freebirdnet Chat. Each offers a conventional messaging app experience distinguished by software architecture decisions intended to defy Brussels.

First, their code is open-source. Anyone can download, modify and distribute the apps. Second, the apps run in a web browser, so they don’t depend on Apple’s or Google’s app stores for distribution. Third, the apps use peer-to-peer technology to relay messages securely and anonymously without using any centralized servers.

“We took ideas from the Tor anonymizing web browsing technique, improved them, and applied them to messaging,” said a member of the legendary hacktivist group Cult of the Living Unicorn who contributes to ValidChat in their spare time.

With no central organization, no server dependencies, no distribution chokepoint, and usage camouflaged as typical Internet activity, there is little EU regulators can do to prevent Europeans from having truly private, end-to-end encrypted chat again.

When asked if criminals using their software bothered them, a software developer of Wallow Messenger said, “The harm to these victims should not be ignored. They deserve justice. We cannot, however, remove the fundamental human right to privacy for everyone in society because a small percentage of people engage in such activities.”

This story is fictional. It was never released by Wired. See disclaimer.

Day 365

Reuters logo

A year after Chat Control, EU lawmakers debate level of surveillance needed for a safer society

Brussels — The European Union’s Child Sexual Abuse Regulation, commonly referred to as chat control, went into effect a year ago. Critics and proponents of the regulation agree on its limited success in protecting children. The next step remains a divisive debate.

Reports to law enforcement of suspected abuse imagery increased 600% in the last year after messaging apps implemented the regulation’s required automated reporting of suspicious content. The increase in reports did not result in an increase in charges or convictions. Only four criminal charges used evidence obtained from the automated submissions, and all of those investigations were underway before the additional evidence arrived.

“We’ve been overwhelmed with automated reports. We don’t have the budget to hire and train enough people to review the reports quickly. It feels like social networks just outsourced their moderation problem to the government,” said a law enforcement officer requesting anonymity.

The EU Centre overseeing the regulation admitted most reports were either harmless, such as vacation photos of nude children playing at a beach, or fell into a legal gray area, such as teenagers sending intimate, but consensual, photos of themselves.

Critics of chat control who feel the law went too far in restricting privacy cite the lack of increased convictions as a reason to repeal a misguided law, while proponents cite the failure as proof the law did not go far enough in aiding law enforcement.

“Crime moves fast and we must move faster. We suspect criminals now use ‘black bubble’ messaging apps to avoid interception,” said another law enforcement officer requesting anonymity.

Messaging apps using peer-to-peer technology inspired by Tor’s “dark web” became known as “black bubble” apps, a reference to Apple’s visual treatment of displaying messages sent using iMessage with a blue background and messages sent using older SMS technology to Android users with a green background.

Member states in the Council of the EU last week debated an expansion of surveillance capabilities beyond messaging apps and child abuse investigations. First proposed during Sweden’s EU Presidency in spring 2023, the updated regulation could require software service providers to give law enforcement access to unencrypted user data when ordered by a judiciary.

Human rights organizations opposed the proposal, citing technical challenges in implementing such a policy and concerns about abuse.

“So-called ‘lawful access by design’ is ‘insecurity by design.’ There is no way to add backdoor access to encryption or an on-device snitch for police that cannot also be potentially exploited by a hostile nation state or criminal hacker. This is not a privacy fetish. It’s a technical reality,” said Jeremiah Lee, a data privacy engineer.

Debate over the balance between the human right to privacy and government’s ability to provide justice to victims is likely to continue for generations. For some, the debate is a distraction from investigating non-technical solutions to non-technical problems.

“What we’re talking about is a kind of self-negating paradox. You cannot do mass surveillance privately, full stop,” said Meredith Whittaker, Signal President.

This story is fictional. It was never released by Reuters. See disclaimer.

Postscript

Take action if this is not the future you want to live in.

  1. Write your members of the European Parliament, your country’s permanent representation to the Council of the EU, and your country’s leaders to express your opposition to chat control. Then, vote accordingly.

  2. Donate to European Digital Rights (EDRi) and your country’s digital rights organization.

  3. Use Signal to chat with your friends and family. Donate to support it.

Further reading

Credits

Written, designed, and coded by
Jeremiah Lee

Special thanks to
Jonathan Cowperthwait
Marcel Waldvogel

Translations

Deutsch: Marcel Waldvogel

Español: CL/Ln. Humberto J. Normey, Lions

Italiano: qyd0ro

Polski: Bartłomiej Garbiec

Svenska: Mattias Axell

Can you help translate this to Français, Nederlands, Vlaams, Esperanto, or another language? Contact me!

Logos and names of companies are trademarks of their respective owners.
Used for the fair-use purposes of journalistic investigation and cultural criticism, review, caricature, and parody.

Open source software used:
Hero Icons (MIT license), Monaspace Krypton (SIL Open Font License 1.1), Old Standard TT (SIL Open Font License 1.1), Tailwind CSS (MIT license)

Meta press release image from Meta Newsroom

Headlines from the first year of Chat Control by Jeremiah Lee is licensed under CC BY 4.0

Last updated: 2024-09-15T21:12:20+0000

←︎ More writing by Jeremiah