Mandatory HTTP 2.0 encryption proposal sparks hot debate

Most Internet Engineering Task Force (IETF) debates pass unnoticed, because they're very dry and detailed. However, a suggestion that the HTTP 2.0 specification might mandate encryption – in a post-Snowden world – is too tasty an idea to go under the radar. The suggestion sparking the debate came from HTTPbis chair, Mark …

COMMENTS

This topic is closed for new posts.
  1. Combat Wombat

    SSL..

    Is already broken, so I would wager that any sort of NSA complaining is more professional theater than anything else.

    Anything truly unbreakable would be illegal.

    1. FrankAlphaXII

      Re: SSL..

      I don't know about illegal, but definitely classified and not in very wide use even at DoD, the other executive departments or NSA/CSS except in very specific cases (think: protecting National Command Authority communications, the Permissive Action Link keys, Ohio-class Ballistic Missile Submarine movements, the drill schedule for the Minuteman silos, the minimum time standard for Delta Force selection [no shit, it's probably the most heavily guarded secret in the Army], as well as some Continuity of Government operational plans, some specialized Emergency Response information like the composition of the Energy Department's Nuclear Emergency Support Team and the FBI's Domestic Emergency Support Team, and the like).

      If you've ever heard of Suite A algorithms, then you understand what I mean. Suite B, what a normal member of society can get and use, is generally good enough with long enough key lengths, but for some things it simply isn't.

  2. FrankAlphaXII

    20 years late, but better than never

    Overblown paranoia about Intelligence agencies, and an irrational clinging to privacy that we do not have and never really had (in my opinion anyway, it's there to sell newspapers), aside: HTTP/1.0 should have mandated encryption as soon as RSA came off the Munitions List and as soon as they started allowing remote financial transactions of any sort over the Internet. It was just expedient not to do so. I don't know if it was utopianist delusion, perceived technical limitations or human laziness, but it was stupidity.

    But, hell, I personally welcome this idea. The American NSA/CSS, the Chinese Third Technical Department of the People's Liberation Army, the Russian Spetssviaz, Germany's BND, Israel's Unit 8200, and your GCHQ can most likely crack any kind of publicly available encryption anyway, so this isn't going to stop them, but it will make the lives of cybercriminals and thieves just that much harder, which everyone should welcome.

    I do wonder how many billions, if not trillions, of dollars/pounds/euros/renminbi/slips of Gold Pressed Latinum have been lost by unlawful interception of cleartext packets containing valuable information, whether by Governments, Criminals, Competing Businesses, or anyone else who has a vested interest in fraud or theft, when it could have been prevented in the first place by defaulting to HTTPS.

    1. Charles 9

      Re: 20 years late, but better than never

      "I do wonder how many billions, if not trillions, of dollars/pounds/euros/renminbi/slips of Gold Pressed Latinum have been lost by unlawful interception of cleartext packets containing valuable information, whether by Governments, Criminals, Competing Businesses, or anyone else who has a vested interest in fraud or theft, when it could have been prevented in the first place by defaulting to HTTPS."

      Probably not as much as you think, as the spooks/malcontents already know how to pwn the endpoints where the encryption has, by definition, been removed. Since content must be plaintext to be usable, they just wait for that point.

      Furthermore, the subversion of CAs has demonstrated that secure communication between relative strangers is pretty much impossible, as security theory can show. Alice and Bob can't trust each other because they've never met, so they need an intermediary, Trent, to vouch for each of them. Gene (the adversary) therefore targets Trent instead. If we're not in a world of "Don't Trust Anyone," we're close.

    2. Anonymous Coward
      Anonymous Coward

      Re: 20 years late, but better than never

      50bn+ at any given time. Per industry. Per annum.

      Some cases are well known - for a particularly heavily subsidized darling industry with 2 major players in the red and blue corners. Some are less known. However, the numbers just between the two sides of the pond are in that range. If we add China, Russia, etc. into the equation, we are probably looking at hundreds of billions per year for some industries, to a trillion-plus total.

  3. Anonymous Coward
    Anonymous Coward

    Listen for the whining

    Ok, a bit tongue in cheek, but if the NSA/GCHQ moan about it, do it; otherwise don't bother. The NSA went very quiet on encryption after the encryption debates of the late '90s; now we know why. The same goes for GCHQ, who initially complained about Labour's mass surveillance project driving everyone to encrypt, but then went suspiciously quiet, even when Call Me Dave refloated the idea. If they stick to form (they are civil servants at heart after all, just with more money and a longer chain), they'll probably point in the right direction.

    At the very least, it makes the whole exercise of intercepting a billion selfies a bit more expensive in time, effort and capacity.

  4. Fazal Majid

    TLS needs to be fixed first

    TLS/SSL needs to be fixed before making it mandatory, otherwise it's just more security theater.

    The NSA can simply order a CA to issue them certificates suitable for man-in-the-middle attacks, or they can order a website operator to disclose their private key, as they did with Lavabit. Given that most TLS cipher suites do not provide perfect forward secrecy, this means they can easily retroactively decrypt your communications.
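
    A minimal sketch of the forward-secrecy point in Python (standard ssl module; the hostname is just a placeholder): restrict the client to ephemeral ECDHE suites, so that recorded traffic can't be decrypted later with a disclosed long-term key.

    ```python
    import socket
    import ssl

    # Only allow forward-secret (ephemeral ECDHE) key exchange, so a recorded
    # session can't be decrypted later with the server's long-term private key.
    ctx = ssl.create_default_context()
    ctx.set_ciphers("ECDHE+AESGCM")

    host = "example.com"  # placeholder
    with socket.create_connection((host, 443)) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            print(tls.version(), tls.cipher())
    ```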

    1. Charles 9

      Re: TLS needs to be fixed first

      But at the same time you NEED the Certificate Authority to act as Trent in the Alice-Bob trust problem. Otherwise, they have NO way of knowing each is really who they claim to be. I think if Gene can target THIS Trent, they can basically target ANY Trent (even a peer-based Trent system by way of tactics similar to search engine gaming). Which takes us back to the problem: IS there a Trent that can't be beholden to this or any other Gene?

      1. Anonymous Coward
        Anonymous Coward

        Re: TLS needs to be fixed first

        No, you don't need a "certificate authority". You can use a formalised "web of trust", or you can just communicate with your correspondent in some other way on just one occasion. For example, phone them up and ask them for their key fingerprint. If the call is not planned in advance then it's very unlikely that anyone will be able to dynamically modify the contents of your phone call, which is what they'd have to do to cover up a man-in-the-middle attack on your Internet communication.

        Likewise, if amazon.com wants to make it possible for users to tell that they are communicating directly with the real amazon.com, then they should put their key fingerprint in print advertisements and on their packaging - and browsers should make it easy to see the fingerprint, which is hardly true at the moment. The current system is completely useless. Probably by design.
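
        A minimal sketch of the fingerprint check in Python, standard library only; the value it prints is what you'd read out over the phone or compare against the published one:

        ```python
        import hashlib
        import socket
        import ssl

        def cert_fingerprint(host: str, port: int = 443) -> str:
            """Return the SHA-256 fingerprint of the certificate the server presents."""
            ctx = ssl.create_default_context()
            # Deliberately skip CA validation: the point is to verify the
            # fingerprint out of band instead of trusting a CA.
            ctx.check_hostname = False
            ctx.verify_mode = ssl.CERT_NONE
            with socket.create_connection((host, port)) as sock:
                with ctx.wrap_socket(sock, server_hostname=host) as tls:
                    der = tls.getpeercert(binary_form=True)
            digest = hashlib.sha256(der).hexdigest().upper()
            return ":".join(digest[i:i + 2] for i in range(0, len(digest), 2))

        print(cert_fingerprint("amazon.com"))  # compare against the published value
        ```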

        1. Charles 9

          Re: TLS needs to be fixed first

          Like I said, ANY form of trust system (Trent, even the Web of Trust) can be subverted by a determined government agency (Gene). A large enough government can create a determined key-signing effort and subvert or compromise some of the identities.

          To defeat two-factor authentication, first you have to assume the party has a second factor at all (if the conversation is international, that's iffy). Second, if one party is a company, then Gene has a single point to subvert: MITM the line people would call to get the second factor.

          Similarly, for your Amazon web example, Gene can MITM all the public key displays, substituting their keys in the ads and relabeling the packages (remember, states have some of the biggest resources available in the name of security). OR they could use an insider to infiltrate and obtain Amazon's private key (some companies HAVE had their private keys compromised--that's how some signed malware slips under the radar).

      2. Dan 55 Silver badge

        Re: TLS needs to be fixed first

        You don't really need to know whether whoever you're talking to is really them; that can be left to something else (e.g. DNSSEC, although that's probably breakable/broken too). You just need to encrypt and make mass interception more difficult.

        1. batfastad

          Re: TLS needs to be fixed first

          I always wondered why web browsers became hugely prejudiced against self-signed SSL certs, especially IE (of old) and other devices that just refuse to let you add an exception.

          I've always thought the concept of putting your trust in a central authority is a bit disgusting, as there's pretty much no one I trust more on this planet than myself (sad but true). It turns out giving everyone a gentle push towards commercial CAs certainly favours the snoopers! If you have everyone running around being their own "CA" with self-signed certs and root private keys, then that's much harder to subvert than a few thousand commercial CAs, who have no choice but to do whatever the gov's lawmen tell them.

          I'd like to have seen some leniency given to self-signed certs if the cert's serial/modulus was also published in a particular DNS TXT record for that domain. Then a visitor could be sure that whoever is in control of the web server is also in control of DNS for that domain. Though how do you guarantee the correct person is in charge of both? And how would a visitor know their DNS lookup hasn't been intercepted to return a forged cert serial to complete the MITM attack? One answer could be DNSSEC... but then you ask, who has the root keys for your TLD!
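
          That is roughly the shape of what DANE/TLSA standardises on top of DNSSEC. A minimal sketch in Python, assuming the third-party dnspython package and a made-up `_certfp` TXT record name:

          ```python
          import dns.resolver  # third-party: dnspython

          def published_fingerprints(domain: str) -> set:
              """Fetch fingerprints the domain owner has published in DNS TXT records."""
              answers = dns.resolver.resolve(f"_certfp.{domain}", "TXT")
              return {b"".join(r.strings).decode().upper() for r in answers}

          # 'seen' would be the SHA-256 fingerprint of the certificate actually
          # presented by the web server during the TLS handshake.
          domain = "example.com"   # placeholder
          seen = "AB:CD:..."       # placeholder fingerprint
          if seen in published_fingerprints(domain):
              print("cert matches the DNS-published fingerprint")
          else:
              print("mismatch: possible MITM or stale record")
          ```

          Without DNSSEC on that lookup you are, as above, back to trusting whoever can intercept your DNS.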

    2. Suricou Raven

      Re: TLS needs to be fixed first

      Issuing false certificates, if used in a non-targeted manner, would be trivial to detect.

  5. John Smith 19 Gold badge
    Unhappy

    Sounds like they need *both* a better protocol than SSL and a better mechanism than TLS

    And doing something about email as well...

    I'm wondering....

    Peer-to-peer certificates? A "web of trust" between users, like an approval rating. Sure, it can still be gamed, but if you think this "CA" has been penetrated or is just a shell, then accept nothing from any source that lists only them as the one who "vouches" for it.

  6. Anonymous Coward
    Anonymous Coward

    Caching

    It is all well and good to discuss how to protect the data in flight, but what about caching?

    In this new scheme, will style sheets be cached by the browser, or will they be transferred in their entirety for each request? Are we about to sacrifice a lot of efficiency for a tiny bit of protection? Most web traffic does not need to be encrypted, especially if the price is a huge increase in volume. This situation is particularly bad over mobile connections.

    1. Suricou Raven

      Re: Caching

      The browser can cache as per usual.

      I'm not sure how much web proxies can cache these days anyway. Almost all text and HTML is dynamic.

    2. Mr Flibble

      Re: Caching

      I run a caching proxy on my network. I'd certainly want that to continue working…

      As for dynamic content – yes, a lot of text is. Many images, style sheets and scripts are static content and, as such, ideal candidates for caching.
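
      For what it's worth, a quick Python sketch for checking whether a given asset is cacheable at all (the URL is a placeholder); over HTTPS these headers are still visible to the browser's own cache, it's only a shared proxy like mine that loses sight of them:

      ```python
      import urllib.request

      # Fetch a static asset and print the headers that govern caching.
      url = "https://example.com/static/style.css"  # placeholder
      with urllib.request.urlopen(url) as resp:
          for name in ("Cache-Control", "ETag", "Last-Modified", "Expires"):
              print(f"{name}: {resp.headers.get(name)}")
      ```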

  7. Neil Barnes Silver badge

    I must admit to having wondered

    Why the (http at least) internet protocol was not encrypted from day one, or at least as soon as usable encryption came along.

    I wonder whether a public/private key system could be usable; each user talks to a server/website using a published-in-many-places website key and provides his own public key as part of the protocol.
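
    A very rough sketch of that idea, using the third-party `cryptography` package's X25519 primitives (all names here are illustrative, and it ignores the hard part: knowing the published website key really belongs to the website):

    ```python
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF

    site_key = X25519PrivateKey.generate()     # website's long-term key pair
    visitor_key = X25519PrivateKey.generate()  # generated per user/session

    # Each side combines its own private key with the other's public key...
    secret_visitor = visitor_key.exchange(site_key.public_key())
    secret_site = site_key.exchange(visitor_key.public_key())
    assert secret_visitor == secret_site

    # ...and derives a symmetric session key from the shared secret.
    session_key = HKDF(
        algorithm=hashes.SHA256(), length=32, salt=None, info=b"session",
    ).derive(secret_visitor)
    print(len(session_key), "byte session key agreed")
    ```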

    1. Suricou Raven

      Re: I must admit to having wondered

      Because Back In The Day people were still using the 386 processor. Encryption costs cycles. Today every desktop is at least a dual core and even most embedded devices can comfortably handle the extra load.

      1. Anonymous Coward
        Anonymous Coward

        Re: I must admit to having wondered

        It is not so much the desktop or embedded devices as the server with the content. With encryption built into CPUs, the penalty on the server side is very little to nil. For the most part, the CPU you have in a desktop has a very similar counterpart in a server. So the desktop side is not a factor; the server is doing the work, as many users are accessing it, not just your desktop.
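
        A rough way to check that on any given box, sketched with the third-party `cryptography` package (which uses hardware AES where the CPU has it):

        ```python
        import os
        import time
        from cryptography.hazmat.primitives.ciphers.aead import AESGCM

        aead = AESGCM(AESGCM.generate_key(bit_length=256))
        payload = os.urandom(1 << 20)  # 1 MiB of dummy page content

        rounds = 100
        start = time.perf_counter()
        for _ in range(rounds):
            aead.encrypt(os.urandom(12), payload, None)
        elapsed = time.perf_counter() - start

        print(f"~{rounds / elapsed:.0f} MiB/s of AES-256-GCM on this CPU")
        ```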

    2. BristolBachelor Gold badge
      Coat

      Re: I must admit to having wondered

      I'd also add that in the beginning, a lot of protocols were debugged by hand - you'd use telnet to connect to a daemon running FTP, HTTP or whatever, and talk to it, trying different possibilities to break it or find out why it just plain didn't work.

      That's probably a good reason for most stuff being in clear text and English/mnemonics.

  8. Werner McGoole

    Yes. Just do it!

    Next!

  9. Vimes

    ...ISPs use to help manage their traffic...

    ...and use it for their own commercial benefit too.

    Just look at TalkTalk's HomeSafe and how the likes of Bluecoat and other 'content categorisation' services (a polite way of referring to content scrapers IMO) can't work when SSL is being used. TalkTalk had to abandon its reference to checking 'secure' websites - and in all likelihood did so because somebody at the company knew enough about how the web works to know that they simply could not do this.

    If the new version of HTTP makes it more difficult for the parasites like Huawei, Phorm, Bluecoat and others to exploit personal and private communications - often without the knowledge of those involved - then this can only be a good thing IMO.

    It's bad enough that government can get so easily into our private lives. At the very least we ought to be limiting invasions of privacy from the private sector - whose primary motivation in everything is profit.

    1. Anonymous Coward
      Anonymous Coward

      "Parasite"

      If Huawei is a "parasite", then GCHQ is Milzbrand*, I assume. Given the Belgacom incident.

      *As additional homework, I leave it to you to look up this word.

  10. an it guy

    what about cost

    Imagine the small web user wants to do so. Adding SSL to a VPS is okay, and not that difficult, but often not a point-and-click thing. It's also an additional cost. If Google were to add it to blogspot.com, then that would make things easy for quite a few people, but it's still a cost and burden that the 'masses' can't afford, technically or cost-wise.

  11. poohbear

    No one has said anything about the costs for website owners to buy SSL certificates in this scheme.

    1. Vimes

      If you're using something like blogspot then I would assume that the shared nature of the service will mean that only one certificate will probably be needed.

      Something similar could apply to shared hosting in general. In either case the cost would probably be minimal to any individual user and would only be noticeable to anybody running their own site with its own domain name that requires its own certificate. Even then the cost would not, I think, be too onerous.

      1. SImon Hobson Bronze badge

        >> If you're using something like blogspot then I would assume that the shared nature of the service will mean that only one certificate will probably be needed.

        >> Something similar could apply to shared hosting in general.

        That only applies if you are using a shared domain, e.g. user.blogspot.com. Once you start having your own domain name, then there is both a cost for the SSL cert for *your* domain, and the knock-on cost of having to host each domain on a unique IP address. The latter is no problem with IPv6 - but it's certain to impact anything still hosted on IPv4 (which means pretty well anything you want to be widely accessible at the moment).

        AIUI, the only way round the shared hosting issue would be to redefine the protocols so that the site name wasn't encrypted - but that's a significant weakening of the protection.

        1. storner

          No need for additional IP's

          You can have as many name-based virtual hosts as you like on one IP. Also with SSL - it's called Server Name Indication (SNI), and any reasonably modern browser supports it. I am quite sure this will be a requirement for implementing HTTP/2.
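
          For what it's worth, a minimal Python client sketch of SNI: passing `server_hostname` puts the requested name in the ClientHello, which is what lets the server pick the matching certificate for that virtual host (hostnames are placeholders):

          ```python
          import socket
          import ssl

          ctx = ssl.create_default_context()

          # Two name-based virtual hosts that could share one IP address.
          for host in ("site-one.example", "site-two.example"):
              with socket.create_connection((host, 443)) as sock:
                  # server_hostname is sent as the SNI extension in the ClientHello.
                  with ctx.wrap_socket(sock, server_hostname=host) as tls:
                      print(host, "->", tls.getpeercert()["subject"])
          ```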

          1. Laie Techie

            Re: No need for additional IP's

            > You can have as many name-based virtual hosts as you like on one IP. Also with SSL - it's called Server Name Indication (SNI), and any reasonably modern browser supports it. I am quite sure this will be a requirement for implementing HTTP/2.

            Without SNI, certificate exchange takes place before the server learns which hostname the client wants; therefore, for older clients that don't send SNI, the server can send at most one certificate per IP. Certificates which allow multiple domains (and not just multiple hostnames) are still expensive, limiting their use.

            http://en.wikipedia.org/wiki/Transport_Layer_Security#TLS_handshake

    2. An0n C0w4rd

      Trust

      Do you trust the SSL cert? The NSA, GCHQ, etc. may be able to get a signing cert from somewhere, issue their own "fake" SSL certs for any box they like, and have them accepted by the browsers as valid.

      1. Yes Me Silver badge

        Re: Trust

        Worse: *anybody* can do that. The best trick is probably to generate a binary of an open source browser that happens to include some home-made CA certificates, post that binary on a juicy-looking free download site, sit back and wait. Also, at best, TLS only authenticates one end, even if the certificate is valid. The trust model for TLS is so broken that bad actors don't even need to worry about breaking the crypto.

    3. Mr Flibble

      DNSSEC and DANE would appear to be what's needed. (Expect resistance from the SSL certificates cartel.)

  12. Jamie Jones Silver badge

    Wrong layer! IPv6 ip-level encryption

    Better to do this at the IP layer if it's to be mandatory.

    I note that the IPv6 standard used to mandate IP-level security (IPsec), but this is now optional:

    http://www.infosecurity-magazine.com/view/34405/did-the-nsa-subvert-the-security-of-ipv6/

  13. An0n C0w4rd

    Encryption without authentication is pointless

    Encrypting the traffic by default is pointless unless you can authenticate that the system you think you are talking to is actually the system you want to talk to and not some intermediate spook system.

    Mandatory encryption would therefore fail to solve the NSA "problem" because of the lack of trust in the authentication systems, i.e. the certificate authorities. They've been proven to be the weak points in the system before. And if people *don't* use authenticated certificates, then the mandatory encryption is pointless.

    1. Anonymous Coward
      Anonymous Coward

      Re: Encryption without authentication is pointless

      Disagree totally.

      1. If the spooks do man-in-the-middle attacks against a large proportion of communication, then we'll soon find out about it. If they don't, then they won't be able to gather a huge archive of data for them to analyse later.

      2. If most communication is encrypted (but unauthenticated) then it will be a lot easier for people who care about secure communication to do their encrypted communication with authentication without drawing attention to themselves (since the authentication can be done invisibly).

      1. Charles 9

        Re: Encryption without authentication is pointless

        Disagree on your disagree.

        1. You forget that massive storage center being built in Utah. They can do a "copy now, decrypt later" tactic and use cryptanalysis, spies, and black projects to obtain the keys later. Not only would copying not introduce lag if done on the side, but they've already shown a willingness to hoover up EVERYTHING to search for the one that gives the game away.

        2. Again, if the spooks just hoover everything up wholesale, then it doesn't matter if you're furtive or not. They'll get you anyway, anytime, anywhere, encrypted or not.
