What Cops Really Want
The Illiberal Principles of Policing and Intelligence Gathering in the Digital Age
by Mallory Knodel
It's 2024, and the world is talking about encryption more than ever. Why?
Because privacy and public interest advocates, whistleblowers, and everyday users have made the case for ubiquitous network transport encryption, end-to-end encrypted messaging, and zero-access storage. Those rousing achievements have inevitably caused backlash, sometimes greater than the achievements themselves.
This situation leaves users in an arguably more precarious position when some services cooperate with cops and others don't. People want to use services that respect privacy, but they don't know which ones to trust.
While much discussion of protecting encryption focuses on law and policy, it is technical proposals that truly spark the imagination of lawmakers. A belief that innovation has produced new technical solutions drives the current debates. Thus it is the technical community and service providers that are literally setting the standard for strong - or broken - encryption.
Understanding these technical proposals is essential to understanding the future policy landscape, and the implications of a world in which people cannot whisper without someone surveilling them. This piece is a guide, in three parts, to what cops really want, how companies propose to provide it, and what we can do to stop it.
First, let's break down exactly what law enforcement and intelligence agencies (cops) are asking for.
We've heard about encryption backdoors for a long time, so how do new proposals fit within this old discussion? What is new? Then I'll explain, as clearly as I can, the technical proposals and their implications.
Finally, I offer my analysis of how good these proposals are at achieving their goal, whether they break encryption, and what else we should be worried about.
Back Doors Are Just the Beginning
To begin, those on the front lines of privacy in the Crypto Wars are seeing a change.
Previously, debates about encryption have centered on "backdoors," or exceptional access - a metaphor for surveillance agencies being able to decrypt encrypted data and spy on message content as it flows by. This overly simple picture of privacy infringement no longer suffices, for reasons I'll explain, but it still holds lawmakers and privacy advocates to the dangerous belief that breaking encryption is a simple procedure: take encrypted content and decrypt it. By contrast, we are now seeing a proliferation of requests from intelligence and law enforcement agencies. They are no longer satisfied with the possibility of mere backdoor access, granted by a warrant, to a message here or there - they want the full firehose of user data.
Warrants are not the preferred tools of investigators. Obtaining data legally as compelled by a warrant is a consideration for prosecutors, not detectives. Cops are going further.
For example, law enforcement in India want something called "traceability," which is really just enhanced metadata. A good encrypted messaging app or service would be inclined to reduce metadata - information like the who, where, and when of a message - so that the service provider can improve user privacy, minimize attack surface, and limit their liability as an intermediary.
But this metadata minimization is at odds with law enforcement interests in compelling intermediaries to turn over server logs. Maybe an agency has received a tip containing the message contents and wants to follow up with the service provider about tracing the content back to its origin. Some go so far as to imagine they can know, from mere metadata, who has received that same message through forwarding and group chats. Nowhere in a proposed traceability scheme that I've seen has there been mention of decrypting encrypted content, yet traceability is incompatible with end-to-end encryption.
Indeed, they are also interested in enhanced metadata preservation, particularly in the late stages of an investigation, when verifiable data can be used in court to help put people behind bars. They are also asking for access in perpetuity, to aid ongoing or future investigations. Any detective's imagination would immediately go to the possibilities afforded if accessing past messages were possible - undoubtedly helpful in criminal investigations. A judge's warrant, after all, has to be specific; blanket access to past and future messages is precisely what it cannot authorize.
So if the target of a request is not just one message in an exchange but more, a new issue arises from a technical perspective. As mentioned before, most secure messaging services are motivated to anonymize or de-link people from the messages they send, beyond what's necessary for assured delivery. In effect, this new request requires them to link messages and their content to the user account.
What About Forward Secrecy?
Linkability requests from law enforcement violate a technical design constraint called forward secrecy.
Also referred to as forward security, this means, for example, that if you have access to my messages today, you won't necessarily have access to those from last year. A request like this goes beyond decrypting one message here or there. It doesn't just violate best practices for privacy and security online; it essentially adds a new feature - linkability - that is out of scope for a messaging service that is differentiated by its commitment to privacy.
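The mechanism behind forward secrecy can be sketched with a toy one-way key ratchet, the idea underlying protocols like Signal's: each message gets a fresh key derived from a chain key, and the old chain key is deleted. This is a simplified illustration, not any real protocol's key schedule.

```python
import hashlib

def ratchet(chain_key: bytes) -> bytes:
    """Derive the next chain key; SHA-256 cannot be run backwards."""
    return hashlib.sha256(b"ratchet" + chain_key).digest()

def message_key(chain_key: bytes) -> bytes:
    """Derive a per-message key from the current chain key."""
    return hashlib.sha256(b"message" + chain_key).digest()

# Toy chain: each message uses a fresh key, then the chain advances
# and the previous chain value is discarded.
chain = hashlib.sha256(b"shared secret").digest()
keys = []
for _ in range(3):
    keys.append(message_key(chain))
    chain = ratchet(chain)   # old chain value is deleted after this

# Someone who seizes `chain` today cannot recompute keys[0..2],
# because the ratchet step is one-way.
assert len(set(keys)) == 3
```

The linkability and retention demands described above would require the service (or client) to hold on to exactly the old key material this design deliberately destroys.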
A service built to keep messages between people confidential is now being built and developed not to improve user experience and confidentiality, but to enhance and extend the reach of the intelligence community and law enforcement.
An end-to-end encrypted messaging service provider could one day be forced to show law enforcement all of the contacts of a user under suspicion for crimes, no backdoor key required. While such a request doesn't involve decrypting the content of the encrypted messages, contact-tracking could help government agencies map our social networks in detail.
It's Censorship, Too
Finally, not only are government agencies trying to get access to private messages, they are trying to prevent certain types of content from being shared online in the first place.
Blocking, filtering, and censoring techniques have been proposed for both end-to-end and zero-access encryption schemes. None of the schemes, as it turns out, has figured out a way to technically limit censorship to objectionable content like child abuse imagery while leaving out memes, intellectual property, political content, or any other target of sufficiently motivated corporate or government censors.
Now that we've established that "backdoor" access has evolved into a whole suite of feature requests, we can turn to proposed technical architectures that dance around the ends of end-to-end encryption without actually decrypting anything.
For example, client-side scanning has been proposed as an "alternative" to an encryption backdoor. In this process, you essentially turn a given device into the site of interception, using the encrypted messaging application itself. To picture this, first we accept that messaging services' encryption is end-to-end, which means the sender and the recipient are the two endpoints.
The service itself cannot access their conversation, and neither can any other intermediary. Client-side scanning, by contrast, takes that design constraint and subtly moves the endpoint into the application itself - the sender's or receiver's client. The application, or even the device, then computes on and reveals message contents before encryption or after decryption.
One proposal suggests leaving encrypted data intact while performing tasks such as hash matching or detecting abuse patterns with what are called "homomorphic" computations. Another involves the use of secure enclaves, creating a secure intermediary that neither disrupts the integrity of end-to-end encryption nor allows access to the encrypted messages.
Processing might occur on a third-party server, positioned between the communication endpoints but without direct access to the encrypted content. This method allows computations on still-encrypted content, preserving the encryption without ever exposing the plaintext. Mathematically, this technique could be used to compare known content against what is being transmitted.
What nearly all of these proposed features focus on is avoiding decryption of an encrypted message. Yet they all imply interception of some form.
Before cryptography was applied to secure messaging, interception of content was restricted, by warrant, as a principle of democracy to protect privacy and commerce. At its best, a secure messaging system deploys end-to-end encryption in a context where interception is still exceptional. But what we see is that encryption in the digital age has hastened and expanded the powers of interception.
Not only that, applications that are designed to keep us private and secure are the very same systems now being asked to stretch their scope beyond their designs for the express purpose of betraying users and their rights to privacy.
Misguided Policy
So far, we've discussed technological proposals for intercepting encrypted content, but we should be equally concerned about the legal and policy elements that have been proposed and their impact on civil society. The societal questions that should be addressed include:
- What are the effects on the legal landscape, from local policy to international human rights, that are being undermined and eroded by these proposals?
- What are the normative and policy impacts on network operators, service providers, and corporate intermediaries caught between protecting children and user privacy?
- What do the proposed changes imply for victims, victim outreach, and social service provision?
In a world where government-mandated backdoors exist and the right to whisper online is prohibited, the proposed technical and policy changes would degrade privacy across society at large, leaving users to navigate the differences between variously broken implementations of end-to-end encryption for communications and messaging.
The proposed changes would affect the privacy of messaging and communications, both technically and culturally, in ways and at a scale not previously realized. At best, these changes would create a chilling effect for those who deserve privacy the most; at worst, they would forever change civil discourse, with long-term effects on society.
Mallory Knodel is the chief technology officer for the Center for Democracy and Technology and a steering committee member of the Global Encryption Coalition. She is the chair of the Human Rights Protocol Considerations research group of the Internet Research Task Force and an advisor to the thirty-eight governmental members of the Freedom Online Coalition on emerging technologies and cybersecurity. She is a consultant and advisor for start-ups, non-profits, and funders on technology issues that matter to the public interest. She has used free software throughout her professional career and is a certified information systems security professional. She holds a BS in physics and mathematics and an MA in science education.