What even dictators can only dream of
Chloé Berthélémy

The European Union is encouraging tech companies to uncover criminal acts for them. This opens the door for authoritarian control mania.

Read the German version here.

Technologically, encryption is a modern norm. We access our favourite websites protected by HTTPS. We catch up with loved ones far away on WhatsApp and chat with friends over Signal. We order clothes, furniture and food online via secure, encrypted payment systems. People, businesses and governments rely on encryption to safeguard their privacy, data and resources.

However, European governments and institutions have struggled to find a consistent position on the technological and practical reality of encryption. While the European Commission acknowledged in its 2017 Security Union report that «encryption is essential to ensure cybersecurity and the protection of personal data», it also described encryption as a major obstacle to the detection, investigation and prosecution of crimes.1 Terrorist attacks and acts of organised and serious crime have in turn sparked repeated calls from governments to find a way around encryption, urging the Commission to devise an EU-wide solution. After the 2015-2016 terrorist attacks in Europe, for instance, the French and German Ministers of the Interior met in Paris and called for legislation that would force companies to weaken encryption standards so that messages of «Islamist extremists» could be intercepted.2 The pressure from the police community is equally high but ambivalent: it claims to support «strong encryption» but opposes «unregulated encryption» – without specifying what distinguishes the two – as a key instrument facilitating crime and preventing law enforcement authorities from accessing evidence in criminal investigations.3

The unsolvable puzzle

Amid this fuzzy political will, the European Commission faces a seemingly unsolvable puzzle: undermining just one component of encryption effectively compromises entire digital systems. If governments mandate the introduction of vulnerabilities – be it in the encryption algorithm, in key management (for example through the creation of a master key) or in any other component – they put the security of all users at risk, not solely in the European Union but worldwide. Such vulnerabilities also enable unlawful access by malicious actors, who can exploit the same security gaps. Lastly, this agenda stands at odds with the EU's own data protection and privacy standards.
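How fragile such a mandated access point would be can be illustrated with a minimal sketch (hypothetical Python code using the open-source cryptography library; the escrow scheme shown is an assumption for illustration, not any concrete EU proposal). In a key-escrow design, every message is encrypted a second time under a master key, so whoever holds – or steals – that single key can read all traffic:

    # Minimal sketch of key escrow, one way a "master key" mandate could look.
    # Hypothetical scheme for illustration only; requires: pip install cryptography
    from cryptography.fernet import Fernet

    recipient_key = Fernet.generate_key()  # known only to the recipient
    master_key = Fernet.generate_key()     # escrowed "lawful access" key

    def send(plaintext: bytes) -> tuple[bytes, bytes]:
        # The sender must encrypt every message a second time under the
        # master key, so the escrow holder can read it without the recipient.
        return (Fernet(recipient_key).encrypt(plaintext),
                Fernet(master_key).encrypt(plaintext))

    for_recipient, for_escrow = send(b"confidential message")

    # The intended recipient decrypts as usual ...
    assert Fernet(recipient_key).decrypt(for_recipient) == b"confidential message"

    # ... but so can anyone who obtains the master key - a leak, an insider
    # or a hostile state. One compromised key exposes every message ever sent.
    assert Fernet(master_key).decrypt(for_escrow) == b"confidential message"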

Nevertheless, the Commission, with the help of the member states, has tried to identify «legal and technical measures» to give access to encrypted data «with minimum impact on fundamental rights».4 While desperately seeking to square the circle, it continued to expand training and resources for the police, notably by funding the decryption platform of Europol, the EU's police cooperation agency. The last straw came in 2019, when Facebook (now Meta) announced it would soon deploy end-to-end encryption across its instant messaging services, meaning that a potentially significant number of cases of child sexual abuse material (CSAM), normally reported to the authorities, would no longer be detectable. Since reports of online child abuse had steadily increased over the past decade, the announcement deeply alarmed law enforcement and child protection organisations. Under the ensuing political pressure, Facebook said it would delay the deployment of this critically needed encryption until 2023.

Putting companies in charge

The fight against CSAM has become the main driver of the European encryption debate, stimulating the search for solutions beyond traditional «backdoors», which allow a third party to bypass encryption and gain access to a system hosted by a tech company. In 2020, the European Commission released its Strategy for a more effective fight against child sexual abuse, pointing to the use of encryption as making the identification of perpetrators «more difficult if not impossible».5 It announced an expert process with the industry, responsible for assessing technical solutions that would allow companies themselves to «detect and report child sexual abuse in end-to-end encrypted electronic communications». This new focus on tasking private providers with finding a solution themselves was confirmed in a resolution adopted by EU ministers a few months later.

In 2021, EU policymakers kept a close eye on companies' scanning and filtering capacities when they realised that an update to the EU's privacy legislation would have barred providers from snooping on their users' private communications. The EU therefore rushed to adopt interim legislation allowing companies to continue their voluntary filtering practices. Permanent legislation is now in the making to replace the provisional one, this time involving an obligation to scan private communications. In the summer of 2021, Apple announced its intent to scan all images sent by child accounts and all photos being uploaded to its iCloud service in support of the fight against CSAM. Even though Apple was forced to backtrack after a public outcry from civil society, the episode convinced legislators that control through company power is possible.

Ripple effect on freedoms

One of the main filtering solutions put forward as «respecting privacy» by the Commission, called «client-side scanning» (CSS) because the analysis takes place directly on the user's device, poses serious risks: Firstly, it torpedoes a fundamental principle of end-to-end encryption, namely that only the sender and the recipient are able to read messages. Since the tool can make personal devices searchable on a mass scale, it affects everyone's privacy and security, including children's. This bulk scanning of people's private data, without a court order or individual suspicion, is likely to infringe EU law because of the disproportionate interference with privacy and the chilling effect of surveillance on other freedoms.
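What this scanning step amounts to can be shown in a minimal sketch (hypothetical Python code; deployed proposals rely on perceptual hashes such as Microsoft's PhotoDNA or Apple's NeuralHash rather than exact hashes, so that slightly altered images still match):

    # Minimal sketch of client-side scanning (CSS): before a message is
    # encrypted, the device compares each attachment against an opaque list
    # of hashes of known illegal material and reports any match.
    import hashlib

    # Hash list pushed to every device; users cannot inspect it, so the same
    # mechanism could silently be pointed at any other category of content.
    BLOCKLIST: set[str] = set()  # hashes of flagged images would go here

    def report_to_authority(digest: str) -> None:
        # Stub: in a mandated system this would notify the provider or police.
        print(f"match reported: {digest}")

    def scan_then_send(attachment: bytes) -> bool:
        """Return True if the attachment may be encrypted and sent."""
        digest = hashlib.sha256(attachment).hexdigest()
        if digest in BLOCKLIST:
            report_to_authority(digest)  # leaves the device in plaintext,
            return False                 # outside the end-to-end channel
        return True

The decisive point is that the check runs before encryption ever happens, which is why CSS is often described as breaking end-to-end encryption without touching the cipher itself.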

Secondly, the temptation to expand the technology to other types of content is high. Once the argument on child protection has been won, it is easy for politicians to argue that other horrible crimes, such as terrorism, should be covered as well. Yet terrorist speech and other harmful content often involve legal ambiguities that an automated scanning system is incapable of grasping. Wrongful takedowns and abusive censorship are already commonplace because of online platforms' moderation tools.6 The interference with fundamental rights will only intensify if the technology targets, intentionally or not, the communications of journalists and human rights defenders – especially if the mandatory detection of illegal content comes with an obligation to report to the authorities. Once the system is in place, it will be a windfall for authoritarian governments inside and outside the EU. Like any other technical tool, it can be misused if it falls into unaccountable and ill-intentioned hands. Affected users and civil society should engage in this conversation right now – before it is too late.
