The EU Wants to Scan Your Messages
For the children
The European Union has been trying to pass a law that would require every chat app, every messaging service, and every email provider to scan your private messages for child sexual abuse material.
They’ve been trying since 2022. It keeps failing. And it keeps coming back.
The latest version is more specific than the drafts that came before it. The proposal is officially called the “Regulation to Prevent and Combat Child Sexual Abuse.” Informally, everyone calls it Chat Control.
And yes, it’s exactly what it sounds like.
What they’re proposing
The core of it: messaging services would be legally required to scan all messages for known CSAM, for new and unknown CSAM, and for grooming behavior. All of them. Not just the flagged accounts. Not just the suspicious ones. Everyone.
That includes end-to-end encrypted services. Signal. WhatsApp. iMessage.
The technical term they landed on is “upload moderation.” Which means scanning your message on your device before it gets encrypted and sent. They rebranded it to avoid saying “client-side scanning,” but that’s exactly what it is.
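To make that concrete, here is a minimal sketch of what scanning-before-encryption means mechanically. Everything in it is illustrative: real deployments would use perceptual hashes (PhotoDNA-style, or Apple’s NeuralHash) rather than SHA-256, and a curated hash database pushed to devices. The order of operations is the point, not the details.

```python
# Illustrative sketch only. The hash function, the database, and the
# stub encryption are all stand-ins; what matters is the sequence.
import hashlib

KNOWN_HASH_DATABASE: set[str] = set()  # hashes of known illegal images, shipped to every device

def encrypt(plaintext: bytes) -> bytes:
    return plaintext[::-1]  # stand-in for real end-to-end encryption

def transmit(ciphertext: bytes) -> None:
    print(f"sending {len(ciphertext)} encrypted bytes")

def report_match(digest: str) -> None:
    print(f"reported to authorities: {digest}")

def send_attachment(image: bytes) -> None:
    digest = hashlib.sha256(image).hexdigest()
    if digest in KNOWN_HASH_DATABASE:
        report_match(digest)   # the scan sees the plaintext...
    transmit(encrypt(image))   # ...before encryption ever runs

send_attachment(b"family photo from the beach")
```

However strong the encryption inside `encrypt` is, `send_attachment` has already inspected the plaintext. That is why “upload moderation” and “client-side scanning” are the same thing with different labels.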
Apple tried building something like this in 2021. The on-device hashing system was called NeuralHash. It was supposed to scan photos on your iPhone before they got uploaded to iCloud. The backlash was massive. Security researchers tore it apart, demonstrating hash collisions within weeks of the announcement. Apple killed the project in late 2022 and said it couldn’t be done without creating serious privacy and security risks.
Apple, with unlimited resources and full control of the hardware, concluded it wasn’t safe. The EU thinks it can mandate it for every chat app.
The false positive problem
Billions of messages are sent in the EU every day. Even with a 99% accuracy rate, which is generous for AI-based grooming detection, you’d get millions of false positives. Per day.
Millions of innocent messages flagged and reviewed. Parents sharing photos of their kids at the beach. Teenagers sending each other pictures. Doctors discussing cases. All of it fed into a review pipeline.
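The arithmetic is worth doing explicitly. The numbers below are assumptions for illustration: the message volume is a round-number estimate, the accuracy figure is the generous one from above, and the prevalence of actual abuse material is an assumed tiny fraction. The second calculation shows the base-rate problem: because real CSAM is vanishingly rare relative to total traffic, almost everything the system flags is innocent.

```python
# Assumed round numbers for illustration, not official statistics.
messages_per_day = 5_000_000_000   # order-of-magnitude estimate for the EU
false_positive_rate = 0.01         # the "99% accuracy" scenario
true_positive_rate = 0.99
prevalence = 1e-6                  # assumed fraction of messages that are actual CSAM

flagged_innocent = (1 - prevalence) * messages_per_day * false_positive_rate
flagged_real = prevalence * messages_per_day * true_positive_rate
precision = flagged_real / (flagged_real + flagged_innocent)

print(f"innocent messages flagged per day: {flagged_innocent:,.0f}")
print(f"share of flags that are real: {precision:.4%}")
# ~50,000,000 innocent flags per day; roughly 0.01% of flags are real
```

Under these assumptions, the review pipeline drowns: for every genuine hit, it processes on the order of ten thousand false alarms.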
An open letter signed by hundreds of scientists and security researchers in 2023 warned that the technology to reliably detect CSAM without massive false positives simply does not exist. Not “needs improvement.” Does not exist.
The Commission’s own impact assessment was criticized for overstating what current technology can actually do.
Who wants this
The original proposal came from Ylva Johansson, the EU Commissioner for Home Affairs under the 2019 Commission. She pushed hard. Spain was the most aggressive supporter in the Council. Leaked documents showed Spain favored scanning even inside encrypted messages, no exceptions.
Ireland, Italy, Hungary, Romania, Bulgaria, Greece, and Croatia have generally supported it. France had a complicated position, sometimes in favor with caveats.
Law enforcement agencies and Europol backed it. So did Thorn, an anti-CSAM nonprofit that lobbied extensively in Brussels.
The argument is always the same. Child safety. No one can argue against protecting children without sounding like a monster. That’s the point. It’s a framing designed to make opposition politically radioactive.
Who’s blocking it
Germany. The German government opposed client-side scanning, and the Bundestag passed a resolution against it.
The Netherlands flipped to oppose after the Dutch parliament instructed its government to vote against. Austria, Poland, the Czech Republic, Luxembourg, and Slovenia also pushed back. Together, they formed a blocking minority in the Council.
The European Parliament took a more privacy-protective stance than both the Commission and the Council. The LIBE Committee (Civil Liberties, Justice and Home Affairs) voted to exclude end-to-end encrypted messages from detection orders entirely. It also limited scanning to known CSAM only, not AI-based grooming detection, and required reasonable suspicion before any scanning could happen.
The European Data Protection Supervisor issued a critical opinion. So did the European Data Protection Board. Both said the proposal, as written, was incompatible with fundamental rights.
The encryption question
There is no way to scan end-to-end encrypted messages without breaking end-to-end encryption. This is not a political opinion; it’s how the cryptography works.
If you build a system that scans content before encryption, you’ve created a vulnerability. A backdoor. It doesn’t matter what you call it. If it exists, it can be exploited. By hackers. By governments. By anyone who finds it.
Signal’s president Meredith Whittaker said publicly that Signal would leave the EU rather than comply. WhatsApp’s head Will Cathcart said Meta would not weaken WhatsApp’s encryption. Threema, the Swiss messenger, took the same position.
So the EU has a choice. Pass this law and watch every serious encrypted messenger either leave or get banned. Or don’t pass it.
It keeps coming back
The Belgian presidency tried in early 2024. Failed. The Hungarian presidency tried in late 2024. Pushed hard, circulated multiple compromise texts, tried to carve out exemptions for audio calls while keeping text and image scanning. Still failed. The blocking minority held.
But it didn’t die.
There’s a temporary regulation that allows platforms to voluntarily scan for CSAM. It keeps getting extended. Every time it’s about to expire, it creates political pressure: “If we don’t pass the permanent regulation, platforms will stop scanning entirely.” It’s a ratchet. The temporary measure justifies the permanent one.
And every new Council presidency picks it up again. Tweaks the language. Rebrands the scanning. Adds a carve-out here, removes a safeguard there. Tries to find the exact combination of words that gets past the blocking minority.
The push for European digital sovereignty makes this more complicated. The EU simultaneously argues for data protection as a competitive advantage AND for mandatory mass scanning of private communications. Those two positions are not really compatible.
But that hasn’t stopped anyone.
Targets won’t be affected
People who distribute CSAM professionally use custom servers, steganography, VPNs routed through non-EU jurisdictions, or encrypted channels that no EU regulation can touch. They will not be using WhatsApp. They are not sending unencrypted messages through Gmail.
The people most likely to be affected by this regulation are ordinary citizens.
The Court of Justice of the EU has repeatedly ruled against general, indiscriminate surveillance of communications. The current proposal is essentially that, with extra steps. If it passes, it will almost certainly face legal challenges.
But in the meantime, the infrastructure gets built.
Options
The EU could invest in better-funded law enforcement with targeted investigations. It could require platforms to improve reporting mechanisms. It could fund victim support organizations. It could go after the hosting infrastructure where CSAM lives, not the messages of 450 million Europeans.
All of those would be more effective than scanning every message on the continent, right? I’d say so.
But none of them make for a good press release that says “we’re protecting children.”
The Bottom Line
The EU is the same institution that gave us GDPR. The same one that fined Big Tech billions for privacy violations. The same one I’ve written about positively when it comes to data protection.
And now it wants to read your messages. Just like plenty of other governments already do, if I may add that.
But for the children, of course. I have children. I want them to be safe. I don’t think this is the solution. Do you?