Plan to scan encrypted messages sparks backlash from tech firms and privacy experts
EU moves closer to controversial ‘Chat Control’ law
European policymakers are edging towards a decision that could force technology firms to scan encrypted messages in a bid to tackle child abuse material, reviving a debate that cuts to the heart of online privacy and security.
Law enforcement officials and national representatives will meet in Brussels today, Friday, 12th September, to discuss the latest draft of the so-called Chat Control regulation.
Denmark, which currently holds the rotating presidency of the EU Council, is pushing to bring the proposal to a vote by 14th October, despite fierce resistance from cryptographers, privacy advocates and some member states.
The push follows earlier attempts by the EU Council to advance the controversial child protection legislation, which ran into serious opposition and repeated delays.
The legislation would mandate that technology companies deploy scanning technologies on devices to examine messages, images and videos before encryption takes place. Artificial intelligence and machine learning systems would also be used to detect previously unknown abuse material.
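To make the mechanism concrete, the sketch below illustrates what critics mean by scan-before-encryption: content is checked on the device, in plaintext, before the end-to-end encryption layer sees it. All names and figures are hypothetical, and a plain SHA-256 digest stands in for the perceptual hashes and classifiers a real system would use.

```python
import hashlib

# Illustrative blocklist of fingerprints that a scanning provider might push
# to the device (hypothetical values, not a real database).
KNOWN_MATERIAL_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def fingerprint(data: bytes) -> str:
    # A real deployment would use a perceptual hash or ML classifier here;
    # a cryptographic hash is used only to keep the sketch self-contained.
    return hashlib.sha256(data).hexdigest()

def send_message(plaintext: bytes, encrypt, transmit, report):
    """Scan-then-encrypt flow: the check runs on the plaintext, on the device,
    before end-to-end encryption is applied."""
    if fingerprint(plaintext) in KNOWN_MATERIAL_HASHES:
        report(plaintext)           # flagged content is reported, not sent
        return
    transmit(encrypt(plaintext))    # otherwise the normal encrypted path proceeds
```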
Supporters call it protection, critics see a threat to civil liberties
Supporters, among them France, Italy, Spain and Sweden, argue that the regulation is necessary to close gaps that allow harmful content to circulate undetected on private messaging platforms. Denmark’s draft text stresses that nothing in the legislation should be interpreted as banning or weakening encryption, but insists that “vetted technologies” must be deployed to scan communications before they are scrambled into unreadable code.
Opponents, however, say this is client-side scanning by another name. An open letter signed by 660 researchers this week warned that the new proposal “completely undermines the security and privacy protections that are essential to protect the digital society.”
The signatories added that no machine learning algorithm can reliably identify abusive material without generating huge numbers of false positives.
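The scale problem the researchers point to can be illustrated with simple, entirely hypothetical arithmetic: even a scanner that is wrong only once in a thousand messages would flag millions of innocent messages every day at EU-wide volumes.

```python
# Illustrative base-rate arithmetic; both figures below are assumptions,
# not numbers from the open letter or the draft regulation.
messages_per_day = 5_000_000_000   # assumed EU-wide daily message volume
false_positive_rate = 0.001        # assumed 99.9% specificity

false_flags = messages_per_day * false_positive_rate
print(f"Innocent messages flagged per day: {false_flags:,.0f}")  # 5,000,000
```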
“If Chat Control is passed, opaque AI algorithms will decide whether your personal messages and your private pictures will be flagged or leaked,” said Matthias Pfau, co-founder of encrypted email service Tuta. “This undermines online privacy for over 450 million EU citizens, which is something we must not accept.”
WhatsApp also criticised the proposals. The company previously threatened to withdraw from the UK over similar encryption-weakening proposals in the Online Safety Act.
Concerns over mass surveillance
Those opposing the move describe the scheme as a form of mass surveillance that risks treating hundreds of millions of Europeans as potential suspects.
Technology companies have responded with varying degrees of resistance to the proposed regulations. Signal has warned that it would pull its service from the EU rather than comply with mandatory scanning. Meanwhile, German encrypted email provider Tuta Mail has indicated it would pursue legal action against the EU if the proposals are adopted.
Beyond the technical limitations raised, Matthew Hodgson, CEO of Element, told ComputerWeekly that the regulation is “fundamentally flawed” and warned that “undermining encryption by introducing a backdoor for lawful intercept is nothing other than deliberately introducing a vulnerability, and they always get exploited in the end.”
The concerns echo those raised in the UK over provisions in the Online Safety Act that would enable Ofcom to instruct digital platforms to deploy message-scanning tools.
Divisions among member states
As of this week, 15 member states were reported to be backing the Danish proposal, with six yet to decide and six opposed. Germany remains the key swing vote, and the split reflects sharp divisions between governments under pressure to improve child protection and those alarmed by the risks of mass scanning.
While the Commission insists the regulation is about safeguarding children, the growing resistance from academics, civil society and parts of the tech sector suggests the road to consensus will be tough.
The debate now revolves around whether policymakers can reconcile public safety with the principle of secure, private communication or whether one of these will have to give way.