EU child safety push stalls as ePrivacy derogation expires, age verification app hacked, and CSA Regulation stuck in trilogue

Summary: Europe's effort to protect children online has collided with its own privacy architecture. The ePrivacy derogation allowing voluntary CSAM scanning expired on April 3 after Parliament voted 311-228 to reject its extension, the EU's new age verification app announced April 15 was hacked in under two minutes, and the CSA Regulation (“Chat Control”) remains stuck in trilogue with a July deadline. The ECHR has ruled encryption backdoors violate fundamental rights, while the GDPR, DSA, and proposed CSA Regulation each require knowing whether a user is a child, which itself requires collecting the data that privacy law says you cannot collect about children.

On April 3, the European Parliament voted 311 to 228 to let its temporary ePrivacy derogation expire. That derogation had allowed platforms such as Meta, Google, and Microsoft to voluntarily scan private messages for child sexual abuse material without violating EU privacy law. When it lapsed, the legal basis for those scans disappeared. Twelve days later, the European Commission announced a new privacy-preserving age verification app designed to protect children online. Researchers hacked it in under two minutes. Between the expired law and the broken app sits the entire problem: Europe wants to protect children from online exploitation, but every tool it builds to do so runs into the privacy architecture it spent a decade constructing. The result is a regulatory system at war with itself, where the mechanisms needed to find abused children require collecting exactly the data that EU law says you cannot collect about children.

The scanning gap

The ePrivacy derogation was introduced in 2021 as a stopgap. The European Commission had proposed the Child Sexual Abuse Regulation, known formally as the CSA Regulation and informally as Chat Control, which would mandate that platforms detect and report CSAM in private messages, including end-to-end encrypted ones. The regulation was supposed to replace the voluntary framework within three years. It did not. Trilogue negotiations between the Parliament, Council, and Commission have dragged on since 2022, with the next scheduled meeting on May 4 and a target of reaching political agreement by July. In the meantime, the derogation expired. The National Center for Missing & Exploited Children in the United States, which processes the majority of global CSAM reports, warned that the lapse would cause a measurable drop in referrals from European platforms. Meta confirmed it had paused voluntary scanning in the EU. The Parliament's position is that the derogation was incompatible with the fundamental right to privacy of communications. The child safety organisations' position is that the Parliament just made it legal for platforms to ignore abuse material sitting in their systems.

The CSA Regulation as proposed by the Commission would require platforms to use detection orders issued by a new EU Centre to scan messages for known CSAM, new CSAM, and grooming behaviour. The Parliament stripped out the most contentious elements: it rejected scanning of end-to-end encrypted messages, limited detection to known material using hash-matching technology, and excluded real-time communications. The Council, led by a rotating presidency that has pushed harder on law enforcement access, wants broader scanning powers including for unknown material and grooming. The distance between the two positions is not a detail to be negotiated away. It is a fundamental disagreement about whether private communications can be systematically monitored to protect children, and the European Court of Human Rights has already indicated where it stands.
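
Hash-matching, the only detection method the Parliament is willing to accept, is mechanically simple. The sketch below (in Python, with an invented hash set and function name) shows the core operation under stated assumptions; real deployments use perceptual hashes such as PhotoDNA rather than exact digests, so that resized or re-encoded copies still match.

    import hashlib

    # Hypothetical database of digests for known, already-reviewed
    # material, of the kind the proposed EU Centre would maintain.
    # Real systems use perceptual hashes (e.g. PhotoDNA) so that
    # re-encoded or resized copies still match; an exact SHA-256
    # digest is used here only to keep the sketch self-contained.
    KNOWN_HASHES: set[str] = {
        "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
    }

    def matches_known_material(payload: bytes) -> bool:
        # Hash-matching can only flag *known* material. It cannot
        # detect new imagery or grooming, which is exactly the
        # capability gap between the Parliament's position and the
        # Council's.
        digest = hashlib.sha256(payload).hexdigest()
        return digest in KNOWN_HASHES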

The encryption wall

In February, the ECHR ruled in Podchasov v. Russia that requiring platforms to weaken or backdoor end-to-end encryption violates Article 8 of the European Convention on Human Rights, the right to respect for private life and correspondence. The ruling was directed at a Russian law compelling messaging services to provide decryption keys to the FSB, but its logic applies directly to the CSA Regulation's proposed detection orders. If a platform cannot scan encrypted messages without weakening the encryption, and weakening the encryption violates fundamental rights, then the regulation cannot mandate what its authors intended it to mandate. Signal's president, Meredith Whittaker, said the organisation would leave the EU rather than comply with any law requiring it to compromise its encryption protocol. Apple disabled its Advanced Data Protection feature for users in the United Kingdom after the British government issued a technical capability notice under the Investigatory Powers Act demanding backdoor access to iCloud data. The encryption debate is no longer theoretical. Companies are already making jurisdictional decisions based on where governments demand access to private communications.

The European Data Protection Board and the European Data Protection Supervisor have both issued opinions warning that the CSA Regulation as drafted by the Commission would be disproportionate and incompatible with EU fundamental rights. The EDPS specifically flagged that client-side scanning, the technique proposed as an alternative to breaking encryption, in which content is scanned on the device before it is encrypted, still constitutes mass surveillance because it processes every message to identify the illegal ones. The distinction between scanning before encryption and scanning after encryption is technically meaningful but legally immaterial if the outcome is that every private message is analysed by an automated system. The Parliament's negotiating position reflects this analysis. The Council's does not.
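
To see why the EDPS treats the placement of the scan as immaterial, a minimal client-side scanning sketch helps (Python; the detector and reporting hook are invented, and the Fernet cipher stands in for a real end-to-end protocol). The structure, not the detector, is the point: every message passes through the scanner in the clear before it is encrypted.

    from cryptography.fernet import Fernet  # stand-in for a real E2E protocol

    key = Fernet.generate_key()
    cipher = Fernet(key)

    def scan(plaintext: bytes) -> bool:
        # Hypothetical on-device detector: a hash match, a classifier,
        # or a grooming model. Always-false placeholder for the sketch.
        return False

    def report(plaintext: bytes) -> None:
        # Hypothetical reporting hook to an authority or the EU Centre.
        pass

    def send_message(plaintext: bytes) -> bytes:
        # The scan runs BEFORE encryption, on the sender's device. The
        # ciphertext stays end-to-end encrypted, but only because the
        # content was already inspected in the clear. This ordering is
        # the EDPS's objection: the system analyses every message in
        # order to identify the illegal ones.
        if scan(plaintext):
            report(plaintext)
        return cipher.encrypt(plaintext)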

The age verification paradox

While the CSA Regulation stalls, individual member states have moved ahead with age-based restrictions. France prohibits children under 15 from accessing social media without parental consent. Spain has set the threshold at 16. Greece will ban social media for under-15s from 2027. Austria's threshold is 14. Norway plans to ban social media for under-16s and is developing a national age verification system to enforce it. Europe's accelerating push for social media age limits has produced a patchwork of national laws with no common enforcement mechanism, which is precisely the problem the EU age verification app was supposed to solve.

The Commission's app, announced on April 15, was designed to verify a user's age without revealing their identity to the platform: a zero-knowledge proof system that would confirm someone is over a given age threshold without transmitting their date of birth, name, or any other personal data. It was presented as the technical solution to the paradox of verifying age without collecting age data. Security researchers demonstrated within two minutes of its release that the verification process could be bypassed, undermining the credibility of the one tool the Commission had offered as proof that privacy-preserving child safety enforcement was technically feasible. The app was meant to demonstrate that the trade-off between child protection and data minimisation could be resolved through engineering; its immediate failure demonstrated the opposite.
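
The Commission has not published the app's internals, but the attestation idea it gestures at can be sketched. In the simplified scheme below (Python; the issuer, token format, and function names are all invented for illustration), a trusted issuer signs a bare "over the threshold" claim, so the platform learns one bit and nothing else. A genuine zero-knowledge design would also prevent the issuer and the platform from linking tokens back to a user, which plain signatures do not.

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import (
        Ed25519PrivateKey,
    )

    # Hypothetical trusted issuer (a government eID service, say). It
    # sees the user's date of birth once; the platform never does.
    issuer_key = Ed25519PrivateKey.generate()
    issuer_public = issuer_key.public_key()

    def issue_token(over_threshold: bool, threshold: int) -> bytes | None:
        # The issuer checks the birth date locally and signs only the
        # bare claim, with no name or identifier attached.
        if not over_threshold:
            return None
        claim = f"over:{threshold}".encode()
        return claim + b"." + issuer_key.sign(claim)

    def platform_verify(token: bytes, threshold: int) -> bool:
        # The platform verifies the issuer's signature on the claim and
        # learns exactly one bit: over the threshold, or not.
        claim, _, signature = token.partition(b".")
        if claim != f"over:{threshold}".encode():
            return False
        try:
            issuer_public.verify(signature, claim)
            return True
        except InvalidSignature:
            return False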

The legal collision

The Digital Services Act, which entered full application in 2024, requires platforms to assess and mitigate systemic risks to minors under Article 28, including exposure to harmful content, manipulation through interface design, and processing of personal data in ways that exploit children's vulnerabilities. The DSA's guidelines instruct platforms to implement age-appropriate protections but do not specify how platforms should determine a user's age. The GDPR sets the age of digital consent at 16, with member states permitted to lower it to 13, and requires parental consent for processing children's data below that threshold. GDPR fines increasingly target child data violations, with regulators across Europe treating children's privacy as an enforcement priority. But to enforce age-specific protections, platforms must first determine who is a child, and determining who is a child requires collecting or inferring personal data about every user, including the adults who have a right not to be age-checked.

This is the circularity at the centre of Europe's child safety framework. The GDPR says you cannot process children's data without heightened protections. The DSA says you must protect children from harmful content. The CSA Regulation says you must detect abuse material in private messages. Each obligation requires knowing whether a given user is a child. Knowing whether a given user is a child requires processing their personal data. Processing their personal data to determine their age may itself violate the data minimisation principles that the GDPR enshrines. The age verification app was supposed to cut through this knot. It was broken on arrival. The ePrivacy derogation was supposed to buy time for the CSA Regulation. It expired without a replacement. The CSA Regulation was supposed to create a harmonised framework. It remains stuck between a Parliament that will not accept mass surveillance and a Council that will not accept a regulation without scanning powers.

The July target

The trilogue negotiators have set July as the deadline for political agreement on the CSA Regulation. The compromise proposals circulating in Brussels would limit mandatory detection to unencrypted platforms and known CSAM using hash-matching, with a review clause that could expand the scope if technology improves. Encrypted platforms would face obligations to report when CSAM is detected through user reporting or metadata analysis, but not through content scanning. The EU Centre for child sexual abuse prevention would coordinate cross-border referrals and maintain the hash databases. Whether this compromise can hold is uncertain. Law enforcement agencies across Europe have lobbied heavily for broader scanning, arguing that encrypted messaging is the primary distribution channel for abuse material and that excluding it renders the regulation largely symbolic. Privacy advocates argue that any mandatory scanning infrastructure, once built, will inevitably be expanded to other categories of illegal content, a slippery slope that the ECHR ruling in Podchasov was designed to prevent.

The honest assessment is that Europe has not resolved the tension between child safety and privacy because the tension may not be resolvable through regulation alone. The tools that would protect children all require surveillance capabilities that EU law exists to prevent: scanning messages for abuse material, verifying ages before granting access, monitoring interactions for grooming patterns. The member states that have moved unilaterally with age bans have done so without a credible enforcement mechanism. The Commission's age verification technology failed its first public test. The Parliament killed the one legal instrument that allowed voluntary scanning. And the regulation that was supposed to replace it all remains, after four years of negotiation, a document that nobody can agree on, because the two things it is trying to protect (children's safety and everyone's privacy) demand opposite things from the same infrastructure.