EU Ministers Vote in 2 Weeks on Law That Would Scan Private Messages Before Encryption to Detect Child Abuse

By Yves Tussaud

Europe at a Crossroads: Child Safety Law Tests the Future of Digital Privacy

An EU plan to force tech firms to scan private messages is heading for a decisive vote, with critics warning it could turn Europe’s phones into the backbone of a vast surveillance system.

BRUSSELS — On October 13, justice ministers from across the European Union will gather in a closed room and face a dilemma that could affect half a billion people. At its core, the question looks simple: Should companies be forced to scan private messages for child abuse material? In practice, it’s anything but simple.

The proposal has split Europe’s usual digital rights consensus wide open. Supporters argue that automated scanning is vital to protect children from online predators. Opponents insist it would gut encryption and expose everyone’s private lives to government scrutiny.

What’s on the table isn’t just a law. It’s a redesign of the internet’s plumbing.

The draft Child Sexual Abuse Regulation—nicknamed “Chat Control” by critics—would require so-called “client-side scanning.” That means your phone or laptop would check every message and photo before encryption scrambles it. In effect, private conversations would pass through an invisible checkpoint right on your own device.

One senior EU official, speaking anonymously because negotiations are still ongoing, didn’t mince words: “You can’t pre-scan messages and still claim they’re end-to-end encrypted. It changes the category entirely.”

How the Law Would Work

Commissioner Ylva Johansson introduced the plan back in May 2022. It would let courts issue “detection orders,” legally obliging messaging apps, email services, and cloud storage providers to hunt for three things: known abuse images, new abuse material, and grooming behavior.

Ylva Johansson

On paper, the system sounds tidy. Algorithms compare files to databases of known images, machine learning tries to spot new abusive content, and pattern-matching tools look for conversations that resemble grooming. Suspicious hits would flow to a new EU Centre, which would double-check the findings before alerting police.
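The first of those checks, matching against known images, is essentially a database lookup on file fingerprints. A minimal sketch of the idea, using exact cryptographic hashes for simplicity (deployed systems such as Microsoft’s PhotoDNA use perceptual hashes that also catch resized or re-compressed copies; the database entry below is a placeholder, not real data):

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Exact SHA-256 fingerprint of a file. Deployed scanners use
    perceptual hashes so slightly altered copies still match."""
    return hashlib.sha256(data).hexdigest()

# Placeholder database: in reality, millions of fingerprints of
# previously identified abuse images, maintained by a central body
# such as the proposed EU Centre.
KNOWN_HASHES = {fingerprint(b"<placeholder: known illegal image bytes>")}

def matches_known_image(data: bytes) -> bool:
    """True if this file's fingerprint appears in the database."""
    return fingerprint(data) in KNOWN_HASHES
```

The hard part in practice is not the lookup but the fingerprinting: an exact hash misses any altered copy, which is why perceptual hashing, with its own false-match trade-offs, enters the picture.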

That might sound manageable for email providers or social networks, many of which already scan for illegal content. But end-to-end encrypted apps like WhatsApp, Signal, and iMessage can’t be scanned on company servers—messages there are scrambled from the moment they’re sent. To comply, those apps would have to scan on your phone itself.
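In other words, a compliant end-to-end encrypted app would have to reorder its send path so that the check runs before encryption. A toy sketch of that ordering (all function names and the XOR "cipher" are illustrative stand-ins, not any real app's code or a secure scheme):

```python
def scan_for_prohibited_content(plaintext: bytes) -> bool:
    """Stub for the on-device detector (hash lookup, ML classifier).
    The placeholder rule here is purely illustrative."""
    return b"FLAGGED" in plaintext

def encrypt(plaintext: bytes, key: int) -> bytes:
    """Stand-in for real end-to-end encryption (e.g. the Signal
    protocol). Single-byte XOR is illustrative only, not secure."""
    return bytes(b ^ key for b in plaintext)

def send_message(plaintext: bytes, recipient_key: int) -> tuple[bool, bytes]:
    # The checkpoint runs on the sender's device, on the clear text,
    # *before* encryption: this is what "client-side" means.
    flagged = scan_for_prohibited_content(plaintext)
    # Only after the check does end-to-end encryption take over.
    ciphertext = encrypt(plaintext, recipient_key)
    return flagged, ciphertext
```

The controversy follows directly from this ordering: the device necessarily handles the unencrypted content at the moment it is scanned, so the "end-to-end" guarantee no longer covers the whole path.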

And that’s where politics collides with math.

Red Flags Everywhere

Europe’s own legal watchdogs are uneasy. Both the European Data Protection Supervisor and the European Data Protection Board warned that blanket scanning could breach fundamental rights to privacy and data protection under the EU Charter.

Even the Council Legal Service, usually discreet, questioned whether such orders could survive the EU’s proportionality tests. The bloc’s top courts have repeatedly struck down indiscriminate monitoring of communications.

Technical experts worry about something more concrete. If your device constantly inspects your messages before sending them, then by design, it can’t be fully trusted. Hackers, authoritarian governments, or anyone able to exploit the scanning system would gain a new doorway into private lives.

As one analysis by digital rights groups put it: once the scanning pipeline exists, nothing prevents lawmakers from widening its scope. Today it’s child abuse material. Tomorrow it could be terrorism, copyright, or dissent.

The Case for Pushing Ahead

Still, supporters insist the risks of inaction are too high. Police across Europe say tips from tech firms already play a critical role in investigations. If encryption spreads without scanning, they warn, authorities will “go dark,” losing visibility into crimes that unfold behind unbreakable encryption.

Backers say the plan isn’t mass surveillance but a targeted tool. Detection orders would need judicial approval. The EU Centre would filter false alarms before police ever see them. And the scope would be limited to crimes against children.

Critics aren’t convinced. Screening billions of private communications to find a comparatively small number of illegal ones will inevitably produce errors at scale. Even a system that is 99.9% accurate, applied to billions of messages a day, would wrongly flag millions of innocent conversations. That means innocent parents, journalists, or abuse survivors could see their private messages reported to the authorities.
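The false-positive arithmetic can be made concrete. Assuming, purely for illustration, 5 billion scanned messages a day EU-wide and a 0.1% false-positive rate (both numbers are assumptions for this example, not figures from the proposal):

```python
# Illustrative base-rate arithmetic; both numbers are assumptions
# made for this example, not official figures from the proposal.
messages_per_day = 5_000_000_000   # assumed EU-wide daily message volume
false_positive_rate = 0.001        # i.e. the scanner is 99.9% "accurate"

false_flags_per_day = messages_per_day * false_positive_rate
print(f"{false_flags_per_day:,.0f} innocent messages flagged per day")
# Prints: 5,000,000 innocent messages flagged per day
```

Because legitimate messages outnumber illegal ones by many orders of magnitude, even a tiny error rate swamps the true positives; this is the classic base-rate problem.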

Grooming detection raises even bigger questions. Unlike image-matching, spotting predatory conversations requires analyzing tone, context, and subtle language cues. Current AI tools struggle with that nuance at scale, and experts doubt safeguards could prevent abuse of such a system.

October’s Tipping Point

The Danish presidency of the Council is pushing hard to revive the stalled proposal with compromise wording. But unity is elusive. Germany and Luxembourg have wavered, sometimes voicing privacy concerns, sometimes softening. Whether enough states will line up for the required qualified majority (at least 15 of the 27 member states, representing at least 65% of the EU’s population) is anyone’s guess.

Ministers could walk away with one of three outcomes:

  • A watered-down version that limits detection to known images, leaving encrypted apps untouched for now.
  • A delay, pushing the decision into the hands of a future Council while voluntary scanning continues.
  • Or a full embrace of client-side scanning, sparking instant lawsuits and forcing major messaging apps to choose between compliance, pulling out of the EU, or blocking certain features.

Why It Matters Beyond Europe

Whatever Europe decides, the ripple effects will spread far wider. If the EU demands client-side scanning, it will create the world’s first legally mandated surveillance layer built directly into personal devices. Other governments could follow—or resist. Tech firms might end up building different versions of their apps for different regions. Global encrypted messaging networks could fracture along borders.

For everyday users, little would change at first glance. Messages would still arrive. Photos would still send. But under the hood, each word, image, or video would be quietly inspected before encryption cloaks it.

That reality has civil society groups scrambling. Activists are lobbying governments. Privacy groups are readying lawsuits. Security researchers keep warning about the risks. Yet political pressure to “do something” about child abuse online remains fierce.

Europe now faces a stark question: will it embed surveillance into the foundation of its digital society in the name of protecting children, or decide that some lines of privacy should never be crossed?

We’ll know more soon. Justice ministers meet in Brussels on October 13. What they choose could set the direction of global digital privacy for decades.
