"Chat Control": An explanation of the EU's controversial legal proposal for scanning private messages to detect CSAM (Child Sexual Abuse Material).

For years, the European Union has been synonymous with tough privacy legislation. But a legislative proposal to combat child abuse, which the bloc formally presented back in May 2022, threatens to downgrade the privacy and security of hundreds of millions of the region's messaging app users.

According to the European Commission, the EU executive body that drafted the proposal, the package is designed to protect children's rights online by cracking down on abusers' misuse of mainstream technology tools. Child abusers, it argues, rely heavily on messaging applications to circulate child sexual abuse material (CSAM) and even to gain access to new victims.

Perhaps as a result of lobbying by the child safety tech sector, the EU's approach is techno-solutionist. The Commission proposes regulating digital services, chiefly messaging apps, by placing legal obligations on them to use technology tools to scan their users' communications in order to detect and report illegal activity.

Since 2021, mainstream messaging apps have operated under a temporary derogation from the bloc's ePrivacy rules governing the confidentiality of digital communications (the derogation runs until May 2025, per its latest extension), allowing them to elect to scan people's communications for CSAM in certain scenarios.

The child abuse regulation would set permanent rules compelling AI-based content scanning throughout the EU.

Critics say this would put messaging services in the awkward position of having to deploy imperfect technology to scan users' private correspondence by default, with catastrophic consequences for people's privacy. It would also put the EU on a collision course with strong encryption, because the law would force end-to-end encrypted (E2EE) apps to degrade their security in order to comply with content-screening demands.

Matters are serious enough that the bloc's own data protection supervisor warned a year ago that the proposal represents a "tipping point" for democratic rights. The Council's legal service also believes the regime is incompatible with EU law, according to a leak of its assessment. EU law already prohibits imposing a general monitoring obligation, so if the law passes, it is almost certain to face legal challenge.

So far, the EU's co-legislators haven't agreed on how to proceed on the file. But the draft law remains on the table, and so do all the risks that come with it.

Comprehensive CSAM detection orders
The Commission's original proposal provides that platforms served with a detection order must scan people's messages not only for known CSAM (previously identified material that has been hashed for detection) but also for unknown CSAM (newly created abuse imagery). That second demand greatly heightens the technical challenge of detecting illegal content with high accuracy and a low false positive rate.
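
To illustrate the gap between the two tasks, here is a minimal, hypothetical sketch of matching uploads against a hash list of known material. Note the simplification: real systems such as Microsoft's PhotoDNA use perceptual hashes that survive resizing and re-encoding, whereas the exact cryptographic hash below only matches byte-identical files, and the blocklist entry is a placeholder.

```python
import hashlib

# Hypothetical blocklist of hashes of previously identified images.
# Placeholder value only; real deployments use perceptual hashes.
KNOWN_HASHES = {
    "3a1f0c9e-placeholder",
}

def matches_known_material(image_bytes: bytes) -> bool:
    """Exact-hash lookup: cheap and precise, but it can only ever
    find known images that are already on the list."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES
```

Unknown material offers nothing to look up, so detecting it requires a probabilistic classifier instead, and that is exactly where accuracy drops and false positives creep in.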

Another feature of the Commission's proposal would force platforms to identify grooming activity in real time. That is, in addition to checking uploaded imagery for CSAM, apps would have to parse the text of users' communications to try to detect when an adult might be attempting to entice a minor into sexual activity.

Automated tools tasked with reading general communications between app users for signals that might predict abuse leave enormous room for innocent banter to be misidentified. Taken together, critics of the initiative say, the sweeping CSAM detection demands proposed by the Commission would turn mainstream messaging services into mass surveillance tools.
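
As a toy illustration (a deliberately naive keyword matcher, far cruder than anything a platform would actually deploy, though statistical classifiers share the same failure mode), consider how easily benign chatter trips a text scanner:

```python
# Hypothetical, deliberately naive grooming detector: flag any message
# containing phrases that sometimes occur in grooming conversations.
SUSPECT_PHRASES = {"meet up", "our secret", "send a photo"}

def flag_message(text: str) -> bool:
    lowered = text.lower()
    return any(phrase in lowered for phrase in SUSPECT_PHRASES)

# A perfectly innocent message between friends still gets flagged:
print(flag_message("Let's meet up after class and send a photo to gran!"))
# -> True: a false positive that could be routed on for human review
```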

"Chat control" is the best head they were able to come up with to cram in the fears that the EU is enacting a law that mandates blanket scanning of private citizens' digital messaging-yes, even scouring text exchanges of people sending messages.

What about end-to-end encryption?

The Commission's initial draft of the regulation does not exempt E2EE platforms from its CSAM detection demands, either.

And it's plain that, since E2EE means such services cannot access readable versions of users' communications (they do not hold the decryption keys), secure messaging services would face a particular compliance headache if they were legally obliged to understand content they are incapable of seeing.
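
To make that constraint concrete, here is a minimal sketch of the E2EE model using the open source PyNaCl library (key names and the message are illustrative): keys live only on the endpoints, so the relaying service handles nothing but ciphertext.

```python
# pip install pynacl
from nacl.public import PrivateKey, Box

# Each device generates its own keypair; private keys never leave it.
alice_sk = PrivateKey.generate()
bob_sk = PrivateKey.generate()

# Alice encrypts directly to Bob's public key.
ciphertext = Box(alice_sk, bob_sk.public_key).encrypt(b"see you at 6")

# This ciphertext is all the messaging service ever relays or stores.
# Without a private key from one of the endpoints, it has nothing
# readable to scan.

# Only Bob (or Alice) can decrypt it.
assert Box(bob_sk, alice_sk.public_key).decrypt(ciphertext) == b"see you at 6"
```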

Critics of the plan therefore argue it would compel E2EE messaging platforms to degrade their flagship security protections by adopting risky technologies such as client-side scanning as a compliance measure.

The Commission's proposal does not specify which technologies platforms should deploy for CSAM detection; those decisions would be offloaded to an EU center for countering child sexual abuse that the law would establish. But experts predict the law would most likely be used to force the adoption of client-side scanning.
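
Client-side scanning means the check runs on the user's device, against the plaintext, before any encryption is applied. A hypothetical sketch of the flow (reusing the hash-matching idea from the earlier example; all names are invented for illustration):

```python
import hashlib

KNOWN_HASHES: set[str] = set()  # the blocklist from the earlier sketch

def scan_then_send(plaintext: bytes, encrypt, report) -> bytes:
    """Hypothetical client-side scanning flow: the scan runs on-device,
    against the plaintext, before E2EE encryption is applied."""
    if hashlib.sha256(plaintext).hexdigest() in KNOWN_HASHES:
        report(plaintext)       # flagged content exits the E2EE channel
    return encrypt(plaintext)   # only afterwards is the message encrypted
```

The ordering is the whole point: the scanner sees everything the user writes before it is encrypted, which is why critics say this hollows out the E2EE guarantee even though messages stay encrypted on the wire.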

Another risk is that providers of robustly encrypted services could simply pull out of the region altogether: Signal, for example, has already threatened to exit a market rather than have its security compromised at the behest of the law. That could leave people in the EU cut off from mainstream apps that apply gold-standard E2EE protocols to protect digital communications, such as Signal, Meta-owned WhatsApp or Apple's iMessage.

Detractors argue that none of the measures the EU has drafted would actually do the job as envisioned. What they do anticipate are ruinous knock-on effects for app users, with the private communications of millions of Europeans exposed to imperfect scanning algorithms.

That could trigger false positives at scale, they say: millions of innocent people wrongly implicated in suspicious activity, flooding law enforcement with a pipeline of bogus reports.

The system the EU's proposal envisages would expose citizens' private messages to third parties tasked with checking suspicious content reports forwarded by platforms' detection systems. Even if a flagged item never reached law enforcement for investigation, having been deemed non-suspicious earlier in the reporting chain, it would still have been seen by someone other than the sender and their intended recipients. So RIP, comms privacy.

It would also pose an ongoing security challenge, since personal communications pulled off platforms would need to be kept secure by whoever handles them, and third parties processing content reports might apply poor security practices, risking further exposure of reported content.

People use E2EE for a reason, and keeping a parade of intermediaries away from your data is right up there at the top of the list.

So where does this scary plan stand?
Normally, EU law-making is a three-way process, whereby the Commission tables legislation and its co-legislators, in the European Parliament and Council, work with the bloc's executive to try to come to a compromise they can all agree on.

In the case of the child abuse regulation, however, the EU's institutions have so far taken decidedly different views of the proposal.

A year ago, lawmakers in the European Parliament agreed their negotiating position, proposing major changes to the Commission's text. MEPs from across the political spectrum backed amendments aimed at shrinking the rights risks, including support for a complete carve-out of E2EE platforms from scanning requirements.

They also recommended making scanning a much more focused activity: it should only apply to the messages of individuals or groups suspected of child sexual abuse, rather than the law imposing blanket scanning on all of a platform's users once it is served with a detection order.

Another measure MEPs adopted would limit detection to known and unknown CSAM, nixing the requirement that platforms also detect grooming activity by filtering text-based communications.

The parliament's version also urged other types of measures, such as requiring platforms to improve user privacy protections by defaulting profiles to non-public, to reduce the chances of minors being easily discoverable by predatory adults.

Overall, the parliament's package has an air of far greater balance than the Commission's original plan. Since then, however, EU elections have revised the makeup of the parliament, and the views of the new intake of MEPs are less clear.

There is also the question of what the Council, the body composed of representatives of member states' governments, will do. It has still not agreed a negotiating mandate on the file, so discussions with the parliament cannot begin.

The Council ignored pleas from MEPs last year to align with their compromise. Instead, member states appear to favor a position much closer to the Commission's "scan everything" original, though there are also divisions between member states over how to get there. So far, enough countries have objected to the compromise texts presented by successive Council presidencies to prevent agreement on a mandate.

Leaked proposals from Council discussions suggest member state governments remain keen to preserve blanket scanning powers. But a compromise text from May 2024 tried to repackage how this would be presented, using euphemistic wording such as framing the requirement placed on messaging platforms as "upload moderation."

That prompted a public intervention from Signal president Meredith Whittaker, who accused EU lawmakers of indulging in "rhetorical games" in a ploy to squeeze out just enough votes to support the mass scanning of citizens' comms, something she warned in no-nonsense terms would "fundamentally undermine encryption."

The text leaked to the press at the time was also reported to suggest asking messaging app users to consent to their content being scanned, with key app features disabled for those who refused: they would not be able to send images or URLs.

Under that scenario, users of EU messaging apps would be forced to choose between their privacy and a modern messaging app experience. Privacy-seeking individuals would be downgraded to a set of features more typical of a dumbphone: text and audio only. Yes, that is really what regional lawmakers have been considering.

More recently, though, there are signs that support within the Council for pushing mass surveillance of citizens' messaging may be weakening. Earlier this month, Netzpolitik reported that the Dutch government said it would abstain on another tweaked compromise, citing the impact on E2EE and the security risks posed by client-side scanning.

A discussion of the regulation was also withdrawn from the Council's agenda last week, apparently because of the lack of a qualified majority in support.

However, a large number of EU countries still back the Commission's push for blanket message scanning, and the current Hungarian Council presidency appears willing to keep trying to find a compromise. So the risk has not dissipated yet.

Member states might still hammer out a version of the proposal that satisfies enough of their governments to open the door to talks with MEPs, which would put everything up for grabs in the EU's closed-door trilogue process. So the stakes for European citizens' rights, and for the bloc's reputation as a champion of privacy, remain high.
