(newest first)

  • zz | Sat, 17 May 2014 00:29:44 UTC

    >What good does any encryption do you, if you get detained and told to hand over your passwords or be thrown in jail?
    It will eventually do the same good that Rosa Parks refusing to give up her seat on the bus did, despite the fact that she was immediately arrested and hauled off to jail. If she had given up her seat and gone home to write letters to her senators and congressmen, her grandchildren might still be sitting in the back of the bus to this very day.
    The same good that all of Gandhi's protests did, despite the fact that there was always a troop of British soldiers right there to club the protesters down and haul them off to jail. If he had spent his days writing letters to the queen asking her for redress of his grievances, the British would have stayed until they had stripped India bare of every resource, right down to the topsoil.
    This is not politics, this is a struggle. If you really think the national security state is going to give up the immense power it has accrued just because people ask them kindly to stop, you are a complete... I'd better not say it. If people resist and make it difficult for the government to get what they want without the use of force, there will be injustices and outrage, and eventually we will win. If we go on dumping the entirety of our lives onto the cloud and signing online petitions asking them to please remember their oaths to protect the constitution, they will eventually chip away our rights until there is nothing left.
  • Bernd | Wed, 07 May 2014 21:12:39 UTC

    I don't think it is a good idea to repeat the claim that "unlawful" cryptography was the reason 9/11 was not detected.
  • Eduardo | Fri, 13 Dec 2013 21:00:27 UTC

    This and other problems in IT/CS could be eliminated or, at least, mitigated if every IT/CS "expert" reads RFC 1925 carefully, thinks about it, and applies its wisdom.
  • Longpoke | Sun, 17 Nov 2013 21:26:57 UTC

    Reply to paul and wtarreau here:
    This site or someone on the internet invalidates my signed messages.
  • figj | Thu, 05 Sep 2013 11:25:46 UTC

    Encryption may not solve all privacy problems, but it does put a big hole in wholesale surveillance.
    The problem with "cloud" providers is that they are a man-in-the-middle accumulating massive amounts of data. So they are a sitting duck for the NSA - no warrants necessary, all data aggregated in one place, and hugely simpler than going after millions of end users individually.
    This is why encryption *is* important. Client-side encryption allows end-users to protect their own data (with encryption) and control who has access to it (with keys). Fundamentally, your data on someone else's disk isn't yours, but your encrypted data on someone else's disk isn't theirs either.
    Note that the essence here is file-level encryption controlled by the end-user. It isn't about link encryption (SSL) or storage encryption (disk) which is what most cloud providers tout. They do this to deflect attention from their business model which is to centralise all power and control "server-side".
    Thankfully, emerging companies (such as Lockbox) empower users to do all their own encryption/decryption and key management client-side. When all the keys and encryption/decryption are client-side, the cloud is "blind" to all (encrypted) data being stored and shared, which removes all the usual cloud risks of hackers, rogue administrators, complicit third parties and prying governments. Using Lockbox as the example, the NSA is pretty stuffed - no keys (they are client-side), no ciphertext (users can store their data in any S3 server worldwide) and no access to the application (which comes out of Australia).
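The core idea in this comment, encrypt on the client so the provider only ever holds ciphertext, can be sketched with a toy stdlib-only stream cipher (an HMAC-SHA256 keystream XORed with the data). This is purely illustrative: it is not Lockbox's actual scheme, and a real client would use a vetted AEAD such as AES-GCM.

```python
import hmac
import hashlib
import secrets

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a keystream by running HMAC-SHA256 in counter mode (toy only)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hmac.new(key, nonce + counter.to_bytes(8, "big"), hashlib.sha256).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    """XOR the plaintext with the keystream; prepend the random nonce."""
    nonce = secrets.token_bytes(16)
    stream = _keystream(key, nonce, len(plaintext))
    return nonce + bytes(p ^ s for p, s in zip(plaintext, stream))

def decrypt(key: bytes, blob: bytes) -> bytes:
    nonce, ciphertext = blob[:16], blob[16:]
    stream = _keystream(key, nonce, len(ciphertext))
    return bytes(c ^ s for c, s in zip(ciphertext, stream))

# The key never leaves the client; the "cloud" only ever stores `blob`.
key = secrets.token_bytes(32)
blob = encrypt(key, b"my private file")
assert decrypt(key, blob) == b"my private file"
```

Whoever holds only `blob` (the provider, or anyone who subpoenas it) learns the file's length and nothing else; the key stays client-side.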
  • wtarreau | Sun, 25 Aug 2013 09:02:58 UTC

    Longpoke: "And indeed, due to the birthday problem, it would only take thousands (not 100 million) of connections for me to discover that the service has duplicate keys, which would make me think its entropy source is broken. I'd then proceed to connect a few billion times using multiple IP addresses to record all the keys."
    Have you ever done this for all the services you're currently using on the web? If the answer is "no", then you understand the problem of trust that we've been having with mandatory encryption forever.
    What I'm noticing as a user is that I'm getting more and more bad-cert warnings. This never happened 10 years ago. Now there is a trend of enforcing https everywhere, and many sites don't care enough about their certs, or use CAs that are not known to any-but-the-very-latest browsers, etc. In the end, I still find myself clicking through cert details all day long, but I know many people who blindly click the right buttons without even thinking about it. What we're doing is training users to ignore security for unimportant things and to get used to doing so. We should only annoy the user when there is a compelling reason to. It's the same principle as the noisy alarms you can hear all day in large cities: who cares anymore about a ringing alarm? Once in a while it might be a real one, but it goes ignored or unnoticed. A good example of how an excess of security reduces security.
    Someone used the analogy with locks. I can go further: right now you have a lock on your door, and it is required by your insurance company. If your doorlock doesn't work well, you'll run to the store to buy another one and replace it. Now imagine that your insurance company requires a properly working lock on each and every window and door *inside* your house in exchange for a much cheaper price. You end up with 20 locks that constantly have to be opened and closed when entering and leaving your house. They'll fail much more often, you'll sometimes go out with one of them unlocked or not fully working, and you'll get used to not caring much about it. Then the day someone breaks in through the main doorlock, which was working properly, the insurance will not cover it because you had one faulty lock inside. The conclusion is: only enforce security where it really matters, and leave the choice to users where it does not. It will keep them from getting used to false alarms.
    Last, the weakest point is always the clueless end user. You can't force them to understand their computer, because this is complex science. We all know people around us using a completely infected Windows PC with an outdated antivirus configured to shut up, because that is the way their PC works best *for them*. When you shout at them, they tell you that if they re-enable it, they can't consult their bank account online, they can't check their kids' photos without clicking through warnings all the time, etc. These people don't need a secure operating system in the first place. They just need a browser to go on the net once in a while, the way they turn on their radio. By piling on security features we make their experience too complicated for them, and they end up completely disabling security to get what they need.
  • Poul-Henning Kamp | Wed, 21 Aug 2013 23:18:44 UTC

    I have specifically waited a little while before I replied to you longpoke, I thought a good example of my point would show up soon.
    It did.
    What good does any encryption do you, if you get detained and told to hand over your passwords or be thrown in jail?
    This is a political problem, not an encryption problem.
  • Longpoke | Sun, 04 Aug 2013 02:53:41 UTC

    context: comment on the article "Columns > The Bikeshed - More Encryption Is Not the Solution" (by Poul-Henning Kamp on July 1, 2013) on ACM Queue
    "on all HTTPS connections out of the country, the symmetric key cannot be random; it must come from a dictionary of 100 million random-looking keys that I provide."
    "The key from the other side? Slip that in there somewhere, and I can find it (encrypted in a Set-Cookie header?)."
    What? I don't understand what this means or why it is part of your attack procedure. You already selected a finite set of keys that the service will use with clients, so you can decrypt any traffic you intercept by simply running through this set of keys, without further information from, or privileged access to, the service.
    "In the long run, nobody is going to notice that the symmetric keys are not random - you would have to scrutinize the key material in many thousands of connections before you would even start to suspect something was wrong."
    Wrong. A cryptographer performing a black box audit on a service would be quick to try various methods of cryptanalysis, especially testing the randomness of key material. Thousands is not a big number for computers or even the internet. And indeed, due to the birthday problem, it would only take thousands (not 100 million) of connections for me to discover that the service has duplicate keys, which would make me think its entropy source is broken. I'd then proceed to connect a few billion times using multiple IP addresses to record all the keys.
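The "thousands, not 100 million" claim checks out numerically. A quick sketch of the birthday bound for a hypothetical dictionary of 10^8 keys (the numbers are illustrative, not taken from the article):

```python
import math

D = 100_000_000  # size of the hypothetical key dictionary

# Connections needed for a ~50% chance of observing a repeated key
# (birthday bound: n ~ sqrt(2 * D * ln 2)).
n_half = math.sqrt(2 * D * math.log(2))
print(round(n_half))  # 11774 -- thousands, not millions

def collision_probability(n: int, d: int = D) -> float:
    """Approximate chance of at least one duplicate in n draws from d keys."""
    return 1 - math.exp(-n * (n - 1) / (2 * d))

print(f"{collision_probability(5_000):.3f}")   # ~0.117
print(f"{collision_probability(20_000):.3f}")  # ~0.865
```

So an auditor recording key material from roughly 12,000 connections would already have even odds of spotting a duplicate, which never happens with properly random 128-bit keys.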
    But yes, your point that it is possible to bribe providers to give you a backdoor is true.
    "That is the basic problem with cryptography as a means of privacy: it works only if both ends work at it in good faith."
    Yes. This is why end-to-end encryption exists. Instead of putting your trust in a stupid little corporation, you encrypt your data to the recipient's public key, and the mail server or chat server simply can't see what's being said. Ironically, from the title of this article, I assumed it was going to be about some pitfall in end-to-end encryption, or about cascade ciphers...
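The end-to-end idea, where anyone can encrypt to the recipient but only the recipient can decrypt, so the relaying server sees ciphertext only, can be illustrated with textbook RSA using the classic tiny-prime example (wildly insecure, for illustration only; real systems use padded RSA or elliptic-curve schemes):

```python
# Toy textbook RSA with tiny primes -- illustration only, not secure.
p, q = 61, 53
n = p * q                # 3233, the public modulus
phi = (p - 1) * (q - 1)  # 3120
e = 17                   # public exponent (published with n)
d = pow(e, -1, phi)      # 2753, the recipient's private exponent

def encrypt_to_recipient(m: int) -> int:
    """Anyone -- the sender, or the mail server -- can compute this from (e, n)."""
    return pow(m, e, n)

def decrypt(c: int) -> int:
    """Only the holder of the private exponent d can invert it."""
    return pow(c, d, n)

message = 65
ciphertext = encrypt_to_recipient(message)
print(ciphertext)           # 2790 -- all the server ever sees
print(decrypt(ciphertext))  # 65
```

The server relaying `ciphertext` has the public key too, but that only lets it encrypt, not decrypt; that asymmetry is the whole point the commenter is making.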
    "Consider the stock-quote application for my smartphone, shown in figure 1. I can neither disable nor delete this app, and it has permission to access everything the phone can do."
    That's your own problem.
    "You could also hire a bunch of good programmers, pay them to get deeply involved in open source projects, and have them sneak vulnerabilities into the source code."
    This only works on *nix-based operating systems because they are monolithic and trust all code for no reason. More and more effort is now being put into operating systems that are both small enough to be verified by one or a few people and give minimal privileges to programs. In fact, half a century ago the capability security model was created, in which you don't need to trust any code except the kernel, which is very small.
    Maybe this article should have been called "Linux/Windows Is Not the Solution", or "The Cloud Is Not the Solution".
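The capability model mentioned above can be sketched in a few lines (function names are hypothetical, and Python can only approximate the discipline; real object-capability systems enforce it at the language or kernel level): code receives only the specific handles it is given, instead of ambient authority over the whole machine.

```python
import io

# Ambient authority: any code can open any path the process can reach.
def tally_words_ambient(path: str) -> int:
    with open(path) as f:  # nothing stops it opening "/etc/passwd" instead
        return len(f.read().split())

# Capability style: the caller hands over a readable stream and nothing more.
# The function cannot name, open, or write any other resource.
def tally_words_cap(readable: io.TextIOBase) -> int:
    return len(readable.read().split())

doc = io.StringIO("a capability is an unforgeable reference")
print(tally_words_cap(doc))  # 6
```

Sneaking a vulnerability into `tally_words_cap` buys an attacker almost nothing: the function holds no authority beyond the one stream it was handed.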
    "Politics, Not Encryption, Is the Answer"
  • Poul-Henning Kamp | Thu, 01 Aug 2013 18:22:55 UTC

    A "pinger" was a small radio transmitter which sent out a "ping" signal. There was a famous case before the U.S. Supreme Court, where covert police offered a suspect a can of $random_product as a "promotion" and used the radio transmitter it contained to track his car by radio triangulation.
    I haven't read that specific case, but I believe SCOTUS decided that since he had accepted the "gift" voluntarily, it was constitutionally OK.
    More recently, I believe, SCOTUS held that slapping a GPS tracker on a suspect's car requires a court order, the crucial difference being that the suspect was not involved and did not consent.
  • anon | Thu, 01 Aug 2013 10:42:23 UTC

    What is a "pinger" device which PHK refers to in the article?
  • Max Kington | Thu, 01 Aug 2013 09:53:44 UTC

    Declaring an interest: I've designed (or tried to; it is implemented) a protocol which allows the wider network to discover if it's being gamed. You're right that it's not if it's broken but when, but it can at least be detected. Better yet, if it's risky to coerce the developers, you put off those who'd try.
  • vittunaama | Thu, 01 Aug 2013 09:31:35 UTC

    I don't understand the utility of using known symmetric keys, since TLS is a key-agreement protocol that uses a PRF to derive the key from nonces selected by both parties. It could work for other protocols, though.
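This objection can be made concrete with the TLS 1.2 key derivation from RFC 5246 (simplified sketch; the real handshake also expands the master secret into a key block):

```python
import hmac
import hashlib
import os

def p_sha256(secret: bytes, seed: bytes, length: int) -> bytes:
    """P_SHA256 expansion from RFC 5246, section 5."""
    out, a = b"", seed
    while len(out) < length:
        a = hmac.new(secret, a, hashlib.sha256).digest()  # A(i) = HMAC(secret, A(i-1))
        out += hmac.new(secret, a + seed, hashlib.sha256).digest()
    return out[:length]

def master_secret(pre_master: bytes, client_random: bytes, server_random: bytes) -> bytes:
    """PRF(pre_master_secret, "master secret", ClientHello.random + ServerHello.random)"""
    return p_sha256(pre_master, b"master secret" + client_random + server_random, 48)

pre = os.urandom(48)
client_random = os.urandom(32)
ms1 = master_secret(pre, client_random, os.urandom(32))
ms2 = master_secret(pre, client_random, os.urandom(32))
assert ms1 != ms2  # changing one party's nonce alone changes the key
```

Because the symmetric key is the PRF output over inputs from both sides, a compromised server cannot force the session key into a fixed dictionary as long as the client's nonce and the key exchange are honest; the backdoor in the article has to operate at a different layer.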
  • Lennie | Thu, 01 Aug 2013 09:12:53 UTC

    Encryption isn't binary, all or nothing.
    Please, as a technical community, let's set up transport security everywhere (HTTPS, SMTPS, IPsec, SSH, SRTP, whatever).
    This will take away the dragnet.
    Sure, if the police have a warrant they can search your home, but they shouldn't be installing government cameras in every home.
  • Matt | Thu, 01 Aug 2013 00:43:49 UTC

    This is like saying "locking doors is not the solution" (to burglaries). And of course it won't work very well if burglary is legal. But locking doors that were formerly open, both physical and digital, is a reasonably cheap way for defenders to make life harder for attackers. This is particularly true when some of the largest attackers, the US government and its allies, rely on examining the content of unencrypted communications as they pass through complicit providers, without ever controlling the endpoints.
    We need political change to rein in snoops run amok. And we need technological change to make it harder for future secret courts and laws to reverse the gains made against this generation of them.
  • Joachim | Wed, 31 Jul 2013 17:59:28 UTC

    Dear sir/madam,
    thanks for this great article! 
    Most technical people think that the solution is always another piece of software (or as some people say: "if you only have a hammer every problem will look like a nail"). 
    So I fully agree the solution can't be purely focused on more encryption. It would be much better to put more privacy lawyers in the same room with engineers and have them become more aware of each other's requirements. 
    Here is a blueprint of how, from a legal perspective, data can be spread across several jurisdictions, as well as separating the gatekeeper of that data from the firms which store it, both legally and physically:
    If one were to daisy-chain such gatekeepers, or even spread individual bits across different firms in Iceland, Germany or other high-privacy jurisdictions, we could create an environment in which battles would have to be won in several jurisdictions before the data was completely available.
  • anon | Wed, 31 Jul 2013 13:00:53 UTC

    But this is like saying that since it's not 100% successful all the time, we shouldn't do it? The same argument could be made about changing anything in politics. I think we should do both. Petition your senator and start campaigns to improve the laws and replace the crappy politicians. At the same time, make good tools that even regular people will bother to use, slowing down the surveillance machinery. There will be breaches and failures, but they do offer some resistance. As a hardware engineer whose code is attacked by random bits from space all day, I think the software people have to get their shit together. It's true you can't make perfect systems, but you sure as hell can do much better than what we have now. And always remember: violence cannot solve math problems.