More Encryption Is Not the Solution

Cryptography as privacy works only if both ends work at it in good faith

by Poul-Henning Kamp | July 30, 2013

Topic: Privacy and Rights

The recent exposure of the dragnet-style surveillance of Internet traffic has provoked a number of responses that are variations of the general formula, "More encryption is the solution." This is not the case. In fact, more encryption will probably only make the privacy crisis worse than it already is.

Inconvenient Fact #1 about Privacy

Politics Trumps Cryptography

Nation-states have police forces with guns. Cryptographers and the IETF (Internet Engineering Task Force) do not.

Several nation-states, most notably the United Kingdom, have enacted laws that allow the police to jail suspects until they reveal the cryptographic keys to unlock their computers. Such laws open a host of due process and civil rights issues that we do not need to dwell on here. For now it is enough to note that such laws can be enacted and enforced.

Inconvenient Fact #2 about Privacy

Not Everybody Has a Right to Privacy

The privacy of some strata of the population has been restricted. In many nation-states, for example, prisoners are allowed private communication only with their designated lawyers; all other communications must be monitored by a prison guard.

Many employees sign away most of their rights to privacy while "on the clock," up to and including accepting closed-circuit TV cameras in the company restrooms.

Any person can have the right to privacy removed through whatever passes for judicial oversight in their country of residence, so that authorities can confirm or deny a suspicion of illegal activities. People in a foreign country may not have any right to privacy. Depriving them of their privacy is called "espionage," a fully legal and usually well-funded part of any nation-state's self-defense mechanism.

Inconvenient Fact #3 about Privacy

Encryption Will Be Broken, If Need Be

This follows directly from the first two points: if a nation-state decides that somebody should not have privacy, then it will use whatever means are available to deny that privacy. Traditionally, this meant intercepting mail, tapping phones, sitting in a flowerbed with a pair of binoculars, installing "pingers," and more recently, attaching GPS devices to cars.

Widely available, practically unbreakable cryptography drastically changed the balance of power, and the 9/11 terrorist attacks 12 years ago acted as a catalyst throughout the world for stronger investigative powers that would allow plans for terrorist activity to be discovered before they could be carried out.

Skype offers an interesting insight into just how far a nation-state is willing to go to get past encryption. Originally, Skype was a peer-to-peer encrypted network, and although neither the source code nor the encryption scheme was ever made available for inspection, it was assumed to be pretty good. Then something funny happened: eBay bought Skype for a pile of money, with some vague explanation about allowing buyers and sellers to communicate directly.

To me, as an experienced eBay user, that explanation didn't make any sense at all, certainly not for the kinds of goods I usually purchase—such as vintage HP instrumentation. I assumed, however, that other user segments—perhaps stamp collectors or garden-gnome aficionados—had different modes of trading.

Then some weird rumors started to circulate: eBay had bought Skype without the source code and regretted the purchase. There seemed to be something to those rumors, because eBay sold Skype back to the founder, for a lot less money.

Head scratching now became a serious risk of baldness for people trying to keep track, because then Microsoft bought Skype for a pile of money, and this time the purchase included the source code. Then Microsoft changed the architecture: it centralized Skype so that all Skype conversations would go through a Microsoft server somewhere in the world. At this point human rights activists who had relied on Skype for a clear channel out of oppressive regimes started to worry.

The disclosures by former NSA (National Security Agency) contractor Edward Snowden seem to support the theory that Microsoft bought Skype to give the NSA access to the unencrypted conversations through Skype, although we don't know whether that is the case, nor what the NSA paid for Microsoft's assistance if so.

With expenditures of this scale, there are a whole host of things one could buy to weaken encryption. I would contact providers of popular cloud and "whatever-as-a-service" platforms and make them an offer they couldn't refuse: on all HTTPS connections out of the country, the symmetric key cannot be random; it must come from a dictionary of 100 million random-looking keys that I provide. The key from the other side? Slip that in there somewhere, and I can find it (encrypted in a Set-Cookie header?).

In the long run, nobody is going to notice that the symmetric keys are not random—you would have to scrutinize the key material in many thousands of connections before you would even start to suspect something was wrong. That is the basic problem with cryptography as a means of privacy: it works only if both ends work at it in good faith.
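Here is a minimal sketch (in Python; the agency secret and the exact construction are of course hypothetical) of how such a dictionary could work: every key looks random in isolation, yet the agency only ever has to search 100 million candidates instead of 2**256.

    # Hypothetical dictionary of "random-looking" session keys.
    import hashlib
    import hmac
    import os

    AGENCY_SECRET = b"issued by the agency"   # hypothetical shared secret
    DICT_SIZE = 100_000_000                   # 100 million keys, as above

    def dictionary_key(index):
        """Key number `index` in the agency's dictionary."""
        return hmac.new(AGENCY_SECRET, index.to_bytes(4, "big"),
                        hashlib.sha256).digest()

    def backdoored_session_key():
        """What the coerced server uses instead of a truly random key."""
        index = int.from_bytes(os.urandom(4), "big") % DICT_SIZE
        return dictionary_key(index)

    def agency_crack(matches_traffic):
        """Brute-force at most DICT_SIZE candidates, not 2**256."""
        for index in range(DICT_SIZE):
            key = dictionary_key(index)
            if matches_traffic(key):
                return key
        return None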

Major operating-system vendors could be told to collect the keys to encrypted partitions as part of their "automatic update communication," and nobody would notice that 30-40 extra random-looking bytes got sent back to the mother ship. That would allow any duly authorized officer of the law simply to ask for the passwords, given the machine's unique identifier. That would be so much more efficient and unobtrusive than jailing the suspect until he or she revealed it. For one thing, the suspects wouldn't even need to know that their data was under scrutiny.
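A minimal sketch of how such an escrow channel might look, again with a purely hypothetical agency secret: the volume key is XORed with a keystream derived from the machine's unique identifier, so the 32 escrowed bytes are indistinguishable from random padding to anyone else.

    # Hypothetical escrow of a 32-byte disk-encryption key inside
    # routine update telemetry.
    import hashlib
    import hmac

    ESCROW_SECRET = b"held by the agency"      # hypothetical

    def escrow_blob(machine_id, volume_key):
        """32 random-looking bytes appended to the update traffic."""
        pad = hmac.new(ESCROW_SECRET, machine_id, hashlib.sha256).digest()
        return bytes(a ^ b for a, b in zip(volume_key, pad))

    def agency_recover(machine_id, blob):
        """Given the machine's unique identifier, invert the XOR."""
        pad = hmac.new(ESCROW_SECRET, machine_id, hashlib.sha256).digest()
        return bytes(a ^ b for a, b in zip(blob, pad))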

Building backdoors into computing devices goes without saying. Consider the stock-quote application for my smartphone, shown in figure 1. I can neither disable nor delete this app, and it has permission to access everything the phone can do.

No, I don't trust my smartphone with any secrets.

You could also hire a bunch of good programmers, pay them to get deeply involved in open source projects, and have them sneak vulnerabilities into the source code. Here is how the result could look:

In September 2006, somebody pointed out that Valgrind complained about a particular code line and managed to get it removed from the Debian version of OpenSSL. Only two years later did somebody realize that this had reduced the initial randomness available to the cryptographic functions to almost nothing: a paltry 32,000 different states.1

As spymaster, I would have handed out a bonus: weakening cryptographic key selection makes brute-force attacks so much more economical.
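A toy reconstruction (not OpenSSL's actual code) shows why the weakness was such a gift: with the process ID as the only remaining entropy source, there are on the order of 2**15 = 32,768 possible states, and an attacker simply enumerates them all.

    # Brute-forcing a key space crippled to ~2**15 states.
    import hashlib

    def crippled_key(pid):
        # Stand-in for key generation whose only entropy is the PID.
        return hashlib.sha256(pid.to_bytes(2, "big")).digest()

    def brute_force(observed_key):
        """Recover the seed by enumerating every possible state."""
        for pid in range(2**15):           # the entire key space
            if crippled_key(pid) == observed_key:
                return pid
        return None

    assert brute_force(crippled_key(12345)) == 12345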

Open source projects are built on trust, and these days they are barely conscious of national borders and largely unaffected by any real-world politics, be it trade wars or merely cultural differences. But that doesn't mean that real-world politics are not acutely aware of open source projects and the potential advantage they can give in the secret world of spycraft.

To an intelligence agency, a well-thought-out weakness can easily be worth a cover identity and five years of salary to a top-notch programmer. Anybody who puts in five good years on an open source project can get away with inserting a patch that "on further inspection might not be optimal."

Politics, Not Encryption, Is the Answer

As long as politics trumps encryption, fighting the battle for privacy with encryption is a losing proposition. In the past quarter century, international trade agreements have been the big thing: free movement of goods across borders and oceans, to the mutual benefit of all parties.

I guess we all assumed that information and privacy rights would receive the same mutual respect as property rights did in these agreements, but we were wrong.

We can all either draw our cloud services back home or deal only with companies subject to the same jurisdiction as us—insist on "Danish data on Danish soil," and so on. This already seems to be a reflex reaction in many governments—there are even rumors about an uptick in sales of good old-fashioned typewriters. That will solve the problem, but it will also roll back many of the advantages and economic benefits of the Internet.

Another option is to give privacy rights the same protection as property rights in trade agreements, up to and including economic retaliation if a nation-state breaks its end of the bargain and spies on citizens of its partner countries. This is not a great solution (it would be hard to detect and enforce), but it could sort of work.

The only surefire way to gain back our privacy is also the least likely: the citizens of all nation-states must empower politicians who will defund and dismantle the espionage machinery and instead rely on international cooperation to expose and prevent terrorist activity.

It is important to recognize that there will be no one-size-fits-all solution. Different nation-states have vastly different attitudes to privacy: in Denmark, tax forms are secret; in Norway they are public; and it would be hard to find two nation-states separated by less time and space than Denmark and Norway.

There will also always be a role for encryption, for human-rights activists, diplomats, spies, and other "professionals." But for Mr. and Mrs. Smith, the solution can only come from politics that respect a basic human right to privacy—an encryption arms race will not work.

Reference

1. Schneier, B. 2008. Random number bug in Debian Linux. Schneier on Security blog; http://www.schneier.com/blog/archives/2008/05/random_number_b.html.

LOVE IT, HATE IT? LET US KNOW feedback@queue.acm.org

Poul-Henning Kamp (phk@FreeBSD.org) is one of the primary developers of the FreeBSD operating system, which he has worked on from the very beginning. He is widely unknown for his MD5-based password scrambler, which protects the passwords on Cisco routers, Juniper routers, and Linux and BSD systems. Some people have noticed that he wrote a memory allocator, a device file system, and a disk-encryption method that is actually usable. Kamp lives in Denmark with his wife, his son, his daughter, about a dozen FreeBSD computers, and one of the world's most precise NTP (Network Time Protocol) clocks. He makes a living as an independent contractor doing all sorts of stuff with computers and networks.

© 2013 ACM 1542-7730/13/0700 $10.00

Originally published in Queue vol. 11, no. 7

Comments

Displaying 10 most recent comments. Read the full list here
  • vittunaama | Thu, 01 Aug 2013 09:31:35 UTC

    I don't understand the utility of using known symmetric keys, since TLS is a key agreement protocol that uses a PRF to derive the key from the nonces selected by both parties. It could work for other protocols, though.  
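    A rough illustration of this point, assuming TLS 1.2-style key derivation (RFC 5246) and reduced to its skeleton: the master secret is a PRF over the pre-master secret plus nonces from both client and server, so neither end can unilaterally force the resulting keys into a fixed dictionary.

        # Simplified TLS 1.2-style PRF (P_SHA256); an illustration only,
        # not a real TLS implementation.
        import hashlib
        import hmac

        def p_sha256(secret, seed, size):
            out, a = b"", seed
            while len(out) < size:
                a = hmac.new(secret, a, hashlib.sha256).digest()           # A(i)
                out += hmac.new(secret, a + seed, hashlib.sha256).digest()
            return out[:size]

        def master_secret(pre_master, client_random, server_random):
            # Nonces from *both* parties feed the derivation (RFC 5246, 8.1).
            return p_sha256(pre_master,
                            b"master secret" + client_random + server_random, 48)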
  • Max Kington | Thu, 01 Aug 2013 09:53:44 UTC

    Declaring an interest: I've designed (or tried to, at least; it is implemented) a protocol which allows the wider network to discover if it's being gamed. You're right that it's not a question of if it's broken but when, but it can at least be detected. Better yet, if it's risky to coerce the developers, you put off those who'd try.  http://talariachatapp.wordpress.com/2013/07/29/being-unpopular/
  • anon | Thu, 01 Aug 2013 10:42:23 UTC

    What is a "pinger" device which PHK refers to in the article?
  • Poul-Henning Kamp | Thu, 01 Aug 2013 18:22:55 UTC

    A "pinger" was a small radio-transmitter which sent out a "ping....ping...ping..." signal.  There was a famous case in US-supreme court, where covert police offered a suspect a can of $random_product as a "promotion" and used the radio-transmitter it contained to track his car by radio-triangulation.
    
    I havn't read that specific case, but I belive the SCOTUS decided that since he had accepted the "gift"voluntarily, it was constitutionally OK.
    
    Recently I belive SCOTUS held that slapping a GPS-tracker on a suspects car required a court order, the crucial difference being that the suspect was not involved and did not consent.
  • Longpoke | Sun, 04 Aug 2013 02:53:41 UTC

    -----BEGIN PGP SIGNED MESSAGE-----
    Hash: SHA1
    
    context: comment on the article "Columns > The Bikeshed - More Encryption Is Not the Solution" (by Poul-Henning Kamp on July 1, 2013) (http://queue.acm.org/detail.cfm?id=2508864) on http://queue.acm.org (ACM Queue)
    
    "on all HTTPS connections out of the country, the symmetric key cannot be random; it must come from a dictionary of 100 million random-looking keys that I provide."
    
    Fine.
    
    "The key from the other side? Slip that in there somewhere, and I can find it (encrypted in a Set-Cookie header?)."
    
    What? I don't understand what this means or why this is part of your attack procedure. You already selected a finite set of keys that the service will use with the clients, so you can decrypt any traffic you intercept by simply running through this set of keys, without further information from or privileged access to the service.
    
    "In the long run, nobody is going to notice that the symmetric keys are not randomyou would have to scrutinize the key material in many thousands of connections before you would even start to suspect something was wrong."
    
    Wrong. A cryptographer performing a black box audit on a service would be quick to try various methods of cryptanalysis, especially testing the randomness of key material. Thousands is not a big number for computers or even the internet. And indeed, due to the birthday problem, it would only take thousands (not 100 million) of connections for me to discover that the service has duplicate keys, which would make me think its entropy source is broken. I'd then proceed to connect a few billion times using multiple IP addresses to record all the keys. (A quick numerical check of this appears after this signed message.)
    
    But yes, your point that it is possible to bribe providers to give you a backdoor is true.
    
    "That is the basic problem with cryptography as a means of privacy: it works only if both ends work at it in good faith."
    
    Yes. This is why end-to-end encryption exists. Instead of putting your trust in a stupid little corporation, you encrypt your data to the recipient's public key, and the mail server or chat server simply can't see what's been said. Ironically, from the title of this article, I assumed it was going to be about some pitfall in end-to-end encryption, or about cascade ciphers...
    
    "Consider the stock-quote application for my smartphone, shown in figure 1. I can neither disable nor delete this app, and it has permission to access everything the phone can do."
    
    That's your own problem.
    
    "You could also hire a bunch of good programmers, pay them to get deeply involved in open source projects, and have them sneak vulnerabilities into the source code."
    
    This only works on *nix-based operating systems because they are monolithic and trust all code for no reason. More and more effort is now being put into operating systems which are both small enough to be verified by one or a few people and give minimal privileges to programs. In fact, half a century ago, the capability security model was created, where you don't need to trust any code except the kernel, which is very small.
    
    Maybe this article should have been called "Linux/Windows Is Not the Solution", or "The cloud Is Not the Solution".
    
    "Politics, Not Encryption, Is the Answer"
    
    WTF?
    -----BEGIN PGP SIGNATURE-----
    Version: GnuPG v2.0.20 (GNU/Linux)
    
    iEYEARECAAYFAlH9rsAACgkQ3PGpByoQpZEAKACdHBtSFQ002Lk1zzQdCKm5sNZw
    rq8An2pq6u7HlfDY16ddP2iTX4aazKR0
    =m27J
    -----END PGP SIGNATURE-----
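    The birthday arithmetic above is easy to check: with a dictionary of d keys, a repeated key becomes more likely than not after roughly sqrt(2 * d * ln 2) observed connections, which for d = 100 million is about 11,800. A quick sketch in Python:

        # Birthday-problem estimate: connections needed before a duplicate
        # key from a dictionary of 100 million becomes more likely than not.
        import math

        d = 100_000_000
        n = math.sqrt(2 * d * math.log(2))
        print(f"~{n:,.0f} connections for a 50% collision chance")  # ~11,774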
  • Poul-Henning Kamp | Wed, 21 Aug 2013 23:18:44 UTC

    I specifically waited a little while before replying to you, Longpoke; I thought a good example of my point would show up soon.
    
    It did.
    
    What good does any encryption do you, if you get detained and told to hand over your passwords or be thrown in jail?  http://www.bbc.co.uk/news/uk-23776243
    
    This is a political problem, not an encryption problem.
  • wtarreau | Sun, 25 Aug 2013 09:02:58 UTC

    Longpoke: "And indeed, due to the birthday problem, it would only take thousands (not 100 million) of connections for me to discover that the service has duplicate keys, which would make me think its entropy source is broken. I'd then proceed to connect a few billion times using multiple IP addresses to record all the keys."
    
    Have you ever done this for all the services you're currently using on the web? If the answer is "no," then you understand the problem of trust that we've been having with mandatory encryption forever.
    
    What I'm noticing as a user is that I'm getting more and more bad cert warnings. This never happened 10 years ago. Now there is a trend of enforcing HTTPS everywhere, and many sites don't care enough about their certs, or use CAs that are not known to any-but-the-very-latest browsers, etc. In the end, I find myself clicking through the cert details all day, but I know many people who blindly click the right buttons without even thinking about it. What we're doing is inciting users to ignore security for unimportant things and to get used to that. We should only annoy the user when there is a compelling reason to do so. It's the same principle as the noisy alarms you can hear all day in large cities. Who cares about a ringing alarm anymore? Once in a while it might be a real one, but it goes ignored or unnoticed. A good example of how an excess of security reduces security.
    
    Someone used the analogy with locks. I can go further: right now you have a lock on your door, and it is required by your insurance company. If your door lock doesn't work well, you'll run to the store to buy another one and replace it. Now imagine that your insurance company requires a properly working lock on each and every window and door *inside* your house in exchange for a much cheaper insurance premium. You end up with 20 locks that you constantly have to open and close when entering and leaving your house. They'll fail much more often, you'll get used to sometimes going outside with one of them not locked or not fully working, and you'll learn not to care much about it. Then one day someone gets into your house by breaking the main door lock, which was working properly, and the insurance will not cover it because you had one faulty lock inside. The conclusion is: only enforce security where it really matters, and leave the choice to users where it does not. It will keep them from getting used to false alarms.
    
    Last, the weakest point is always the clueless end user. You can't force them to understand their computer, because this is complex science. We all know people around us using a completely infected Windows PC with an outdated anti-virus configured to shut up, because that is the way their PC works best *for them*. When you shout at them, they tell you that if they re-enable it, they can't consult their bank account online, they can't look at their kids' photos without clicking through warnings all the time, etc. These people don't need a secure operating system in the first place. They just need a browser to go on the net once in a while, just as they turn on their radio. By adding many security features there, we make their experience too complicated for them, and they end up disabling security completely to get what they need.
    
  • figj | Thu, 05 Sep 2013 11:25:46 UTC

    Encryption may not solve all privacy problems, but it does put a big hole in wholesale surveillance.
    
    The problem with "cloud" providers is that they are a man-in-the-middle accumulating massive amounts of data. So they are a sitting duck for the NSA - no warrants necessary, all data aggregated in one place, and hugely simpler than individually going after millions of end users.
    
    This is why encryption *is* important. Client-side encryption allows end-users to protect their own data (with encryption) and control who has access to it (with keys). Fundamentally, your data on someone else's disk isn't yours, but your encrypted data on someone else's disk isn't theirs either.
    
    Note that the essence here is file-level encryption controlled by the end-user. It isn't about link encryption (SSL) or storage encryption (disk) which is what most cloud providers tout. They do this to deflect attention from their business model which is to centralise all power and control "server-side".
    
    Thankfully, emerging companies (such as Lockbox www.lock-box.com) empower users to do all their own encryption/decryption and key management client-side. When all the keys and encryption/decryption is client-side, the cloud is "blind" to all (encrypted) data being stored and shared and thus removes all the usual cloud risks of hackers, rogue administrators, complicit third parties and prying governments. Using Lockbox as the example, the NSA is pretty stuffed - no keys (they are client-side), no ciphertext (users can store their data in any S3 server worldwide) and no access to the application (which comes out of Australia).
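    As a minimal sketch of that client-side model (illustrative Python using the third-party "cryptography" package; not any particular product's implementation): the key is generated and kept on the client, so the provider only ever stores ciphertext.

        # Client-side encryption sketch: the key never leaves the client.
        # Requires: pip install cryptography
        from cryptography.fernet import Fernet

        key = Fernet.generate_key()        # generated and held by the user
        box = Fernet(key)

        ciphertext = box.encrypt(b"my private file")   # what the cloud stores
        # upload(ciphertext)  -- the provider cannot read these bytes

        assert box.decrypt(ciphertext) == b"my private file"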
    
    
  • Longpoke | Sun, 17 Nov 2013 21:26:57 UTC

    Reply to paul and wtarreau here:
    http://longpoke.blogspot.ro/2013/11/follup-to-comments-on-article-columns.html
    
    This site or someone on the internet invalidates my signed messages.
  • Eduardo | Fri, 13 Dec 2013 21:00:27 UTC

    This and other problems in IT/CS could be eliminated or, at least, mitigated if every IT/CS "expert" read RFC 1925 carefully, thought about it, and applied its wisdom.