New Firefox, Chrome SRI script whip to foil man-in-the-middle diddle

Scripting will in the next few months become safer with Mozilla and Google adopting a validation mechanism to protect against man-in-the-middle attacks. The Subresource Integrity (SRI) check is being developed by boffins at Google, Mozilla, and Dropbox under the World Wide Web Consortium. The specification means the integrity …

  1. Tom Chiverton 1

    Lots of developers are used to live editing files on the server to fix bugs. Are they screwed now?

    1. wolfetone Silver badge

      I wouldn't say they're screwed. If bugs are fixed properly, the fix is made locally and then pushed to the development server. Somewhere in that process there will be an opportunity to hash the files.

      I think every developer, myself included, has edited files at a live level for one reason or another. But security has to be important, and if it means one extra step in the proper workflow, and actually using that workflow to help secure visitors to websites, I'm happy enough to do that.

      1. Tom Chiverton 1

        Snap.

        What will actually happen though is a bug will be found, fixed, and then *the entire site* will crash to a halt when the .js fails to load.

        There will then be several hours of panic before anyone remembers this 'feature' and updates wherever the hashes are.

        1. Dan 55 Silver badge

          Presumably you could get the browser to look at a hash value stored in a .json file on the same server (so you'd have a .js and a .json next to it), meaning you'd need to edit two files instead of one, or maybe the server itself could generate the hash and pass it to the browser in a header.

          We don't know what happens when the hash check fails. Maybe the .js file is ignored and the cached one is used until it expires.
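          For what it's worth, the draft spec puts the hash inline in the referring page's markup, as an integrity attribute on the script tag, rather than in a separate file; and a resource that fails the check is refused outright rather than being swapped for a cached copy. A rough sketch of generating the value with Node.js (the filename and CDN URL are only for illustration):

            const crypto = require('crypto');
            const fs = require('fs');

            // Hash the exact bytes that will be served (filename is illustrative)
            const body = fs.readFileSync('jquery.min.js');
            const digest = crypto.createHash('sha384').update(body).digest('base64');

            // The value goes straight into the referring page's markup, e.g.
            // <script src="https://cdn.example.com/jquery.min.js"
            //         integrity="sha384-<digest>" crossorigin="anonymous"></script>
            console.log('sha384-' + digest);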

    2. Michael Wojcik Silver badge

      They were already screwed, since they're editing files in production.

      Lots of C programmers are used to manipulating buffers without checking for overflow. Forcing them to switch to length-checking "safe" functions in the standard library doesn't make their situation worse.

  2. FF22

    This is not against MITM attacks

    Contrary to what the article implies, this technology has obviously not been developed primarily to thwart MITM attacks, because it can't defend against those in general.

    That said, there are some MITM attacks it can prevent, but those are only a very specific and small subset: the ones where the man in the middle can manipulate only the external resources, but not the referring page. The typical proxy, corporate or ISP-level MITM attacks will still work, because the attacker can also manipulate the HTML file containing the hash values to match those of the manipulated script files.

    On the other hand, this is completely anti-web, because it effectively kills the dynamic nature of the web, which is its very essence. With the hash check in place, changes to the delivery and representation of external resources (to adapt to, for example, bandwidth or device constraints) will no longer be possible, let alone bug fixes, typo corrections or other kinds of well-intentioned modifications.

    Actually, it's kinda pointless, because if someone is that worried about the modification of external resources, they should just host the resources themselves. That would prevent every kind of MITM (and any other type of attack, for that matter) that this technique can be effective against, and it wouldn't require new browsers or extensions to actually do it.

    1. Anonymous Coward
      Anonymous Coward

      Re: This is not against MITM attacks

      "... they should just host the resources themselves..."

      I do, and while I wear a 7-layer tin-foil hat, I know how my pages work with the version of the software that I've downloaded and linked-to locally. It doesn't bother me that I'm more than 50 versions out-of-date, either.

  3. DaLo

    "Marier also urged organisations to add themselves to the browser pre-load list which requires sites to run HTTP strict transport security (HSTS)."

    This does not seem very scalable; the preload list is hard-coded into transport_security_state_static.json as part of the build (in Chromium). At the moment it has about 2,100 domains in it, but if every organisation is encouraged to add itself, surely the list and code will rapidly become unmanageable?
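    For reference, getting onto the preload list hinges on the site already sending a suitable Strict-Transport-Security header over HTTPS. A minimal Node.js sketch (the directive values are the commonly cited minimums, and a plain-HTTP server is used only to keep the example short):

      const http = require('http');

      // HSTS only takes effect over HTTPS; a plain server is used here just
      // to keep the sketch short. The preload list expects a long max-age
      // plus the includeSubDomains and preload directives.
      http.createServer((req, res) => {
        res.setHeader('Strict-Transport-Security',
                      'max-age=31536000; includeSubDomains; preload');
        res.end('ok');
      }).listen(8080);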

  4. Velv
    Boffin

    "The Subresource Integrity (SRI) check is being developed by boffins at ..."

    This is software code. Written by button monkeys. Button monkeys, no matter how smart, are never "boffins". Boffins wear white coats. I doubt the button monkeys know what a coat is.

  5. Tom Chiverton 1

    Where are the hashes

    If the hashes are on the including page, any MITM will just modify that to remove them ?

    If they are on a 3rd party webservice, that's massive overhead, and the MITM will just block access ?

    1. DaLo

      Re: Where are the hashes

      This is aimed more at a remote MITM or, more relevantly, a script-server compromise. If someone compromises the script server, the web host would still serve up the correct hash and the inconsistency would be detected. A local MITM would not be mitigated, but that would be stopped by SSL.

      1. Charles 9

        Re: Where are the hashes

        Not if the MITM is ALSO a secure proxy using a masquerading certificate, which HAS occurred and IS the norm in enterprise settings.

        1. DaLo

          Re: Where are the hashes

          Why would your enterprise need to hack your machine? If a crim has access to your enterprise proxy then you have greater worries than a dodgy script.

          1. Charles 9

            Re: Where are the hashes

            No, I'm talking about a malware SSL proxy relay that's masquerading as the target site. With a fake certificate, it can pass itself off as the target, your browser gets the green light because the connection looks secure, but the proxy can decrypt and alter the traffic to and from the actual target, and you basically have no way to tell the difference. The corporate SSL relay is basically a legit version of the malware SSL proxy.

            In any event, this malware proxy can masquerade as either the site host, the script host, or both, allowing the altering of script and signature no matter where it comes from. It's back to the classic "Who do you trust?" issue.

            1. DaLo

              Re: Where are the hashes

              Eh? Where would they get the fake certificate from? A corporate proxy works because it administers the machines 'below' it, so it can install a trusted root certificate in each computer's certificate store, allowing it to impersonate another site.

              An unconnected third party doesn't have that luxury. Yes, there are a few instances of a compromised trusted certificate store etc., but they are relatively rare outside of state control. If there is an easy way to install trusted root certificates on a user's PC then the whole premise of SSL is broken and there is much more to worry about than a rogue script.

              At the moment the consensus is that SSL generally works and it is that premise that allows for a secure internet.

              As to the original point, it then would not matter whether the hash and the script had been simultaneously compromised, or indeed whether the site you were visiting used scripts at all: your connection is no longer secure and you are being fed whatever the attacker wants to feed you, whether you ask for it or not.

  6. Anonymous Coward
    Anonymous Coward

    The fact that...

    ... the majority of web "developers" are quite happy to link to API code from an external site that they've probably never even looked at, much less verified, to run their JavaScript, which pretty much says it all about how much they know or care about security. Could you imagine a C++ or Java dev #include'ing some random code, or linking with a binary lib or jar file from offsite, on each compile? Exactly.

    The whole JavaScript software model is fundamentally broken. Any dev worth his salary would download the code and at least give it a once-over first, then stick it on his own website to be accessed locally when the page loads, which not only makes it faster but also prevents this sort of code injection from being an issue.

    1. Anonymous Coward
      Thumb Up

      Re: The fact that...

      I tried, but The Register only allows me to upvote you once.

    2. Anonymous Coward
      Anonymous Coward

      Re: The fact that...

      "The whole javascript software model is fundamentally broken. Any dev worth his salary would download the code and at least give it a once over first then stick it on his own website to be accessed locally when the page loads which not only makes it faster but also prevents this sort of code injection being an issue."

      Nope, because a MITM that can masquerade as the source will also happily alter the LOCAL copies, which may indeed be stale with security holes. Polling from the originator at least has the benefit that it's, by default, the fastest route to an official fix.

      The cynic in me says that the entire model of the Internet is broken against the human condition. In fact, the whole idea of mass communication has this problem. It relies on a level of trust that simply cannot be guaranteed. Without experience (the dreaded "First Contact" problem), there is simply no way for Alice to know Bob is really Bob. Not even Trent is of help since there's no real way to know Trent is really Trent, and this can recurse infinitely.

      1. Anonymous Coward
        Anonymous Coward

        Re: The fact that...

        "Nope, because a MITM that can masqerade as the source will also happily alter the LOCAL copies,"

        Unless he's got access to every web server, or he's a man in the middle on every single router on the internet, that's unlikely to happen. With one point of origin for a JavaScript API he only has to hack into one system.

        1. Charles 9

          Re: The fact that...

          And if the masquerade occurs at a major chokepoint, like the ISP, then the malware (which may be the ISP or a government entity) has a lot of traffic to exploit.

    3. FF22

      Re: The fact that...

      Seems like the whole point of the internet and the web (i.e. that they're not local, but distributed) went whoosh over your head.

      Oh, and btw practically all C++ and Java programs DO link to and call into code that they do not verify on each run in the same way web pages do with JavaScript. See dynamically linked libraries and Java packages!

      1. Anonymous Coward
        Anonymous Coward

        Re: The fact that...

        And it seems the whole WEAKNESS of the internet and the web (i.e. that resources are impossible to accurately attribute, meaning you can be trojan-horsed without your knowledge) got stuck on your neck. Combined with a global attitude approaching "Don't Trust Anyone", we're gonna need something rather more complex and comprehensive than this "script whip" to solve the problem.

        Assuming, of course, the problem is actually tractable. Then again, we've yet to figure out how to solve the First Contact problem (Alice and Bob vouching identities when they've never met before) without some sort of Trent (that can HIMSELF be subverted).

  7. Stevie

    Bah!

    Get Rid Of Useless Javascript Now!

    1. Charles 9

      Re: Bah!

      Except it's MUCH more useful than you give it credit. You'll have to explicitly show what anyone using JavaScript can use in its place or no one will switch. Period.

      1. Stevie

        Re: Bah!

        No it isn't useful. Client-side JavaScript allows people to add shiny to webpages, that's all, and to add shiny it has to leave us open to stupid attack vectors that the JavaScripters are incapable of defending against. It offers no functionality that cannot be lived without safely, since any user-generated "content" that is sent down the pipe will have to be validated at the server anyway (unless one doesn't subscribe to common sense practices learned back in the bad old days of Mainframes, Cobol and transaction processing - and one deserves all one gets if one doesn't but one's customer base deserves better).

        If you use JavaScript on the server, all I can ask is "why?" since there are many technologically and ideologically better options available, options that provide more features with better OO implementation on whatever OS you happen to be using.

        And I don't have to explicitly show you or anyone else jack spit. All I have to do is sit back and sigh every time another round of pwnership is laid at the feet of this boil on the backside of the web.

        1. Anonymous Coward
          Anonymous Coward

          Re: Bah!

          "No it isn't useful. Client-side JavaScript allows people to add shiny to webpages, that's all, and to add shiny it has to leave us open to stupid attack vectors that the JavaScripters are incapable of defending against."

          OK, how do you do something like a live board or another dynamic web page WITHOUT client-side code OR constant and annoying (as in customers will probably abandon you) refreshing? Without stuff like Ajax, you'll have to do large and time-consuming PostBacks rather than the more efficient CallBacks: again a major headache for web apps. Not to mention doing everything server-side will put additional strain on the servers, in a day where every Hertz increasingly counts. The only alternatives to JavaScript are Java, Flash, and Silverlight. Guess what? They're more despised than JavaScript! What you're saying is that we have to abandon the web app altogether. News flash: the web app now is like alcohol during Prohibition; people will abandon you before they abandon the web app.
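          A minimal sketch of that kind of Ajax-style partial update (the /api/board endpoint and the board element are made up purely for illustration):

            // Refresh just the live board, not the whole page.
            async function refreshBoard() {
              const res = await fetch('/api/board');
              document.getElementById('board').innerHTML = await res.text();
            }
            setInterval(refreshBoard, 5000); // poll every five seconds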

          "And I don't have to explicitly show you or anyone else jack spit. All I have to do is sit back and sigh every time another round of pwnership is laid at the feet of this boil on the backside of the web."

          Yes you do, or your argument wouldn't survive a court (and this IS a court in a sense—a court of public opinion). Specifics or you're making a hollow case, in which case you will likely be ignored. I at least specifically note why JavaScript is still needed (because server-side can't do dynamic content).

          1. Stevie

            Re: Bah!

            Who says web pages have to be dynamic? Shinyshinyshiny bollox. Nothing we do on the web these days *requires* dynamic scripting of the web page at the client.

            And this isn't a court of any kind. As far as you and your JavaScript friends are concerned, it is a claque.

            1. Charles 9

              Re: Bah!

              So you call an online collaborative whiteboard bollox? An online graphical language recognizer for an elaborate language like Chinese or Japanese where specific stroke order is important? A place like eBay where timing is key (the delay of a page load can be the difference between winning and losing an auction--I can speak from experience).

              So you ask WHO says web pages have to be dynamic? YOUR CLIENTS DO! Time is money in today's society, so static pages are passé.

  8. Anonymous Coward
    Anonymous Coward

    So if I understand correctly

    The vast majority of web developers are now nothing more than some sort of white-hat script kiddies.

    Who would have known that!

  9. Michael Wojcik Silver badge

    jQuery

    he found 2.5 million references in GitHub accounts to a jQuery script hosted on a Google server.

    "What could possibly happen if someone gets into that server? An attacker could add a malicious payload ..."

    What, more malicious than jQuery itself? The mind boggles.

    Why so many people are so eager to use a library written by people who not only don't understand the language they're working in [1], but are prone to throwing tantrums when implementations follow the language specification rather than the jQuery author's personal beliefs [2], is beyond me.

    Though I suppose few of the JavaScript libraries are better. And we can hardly expect self-professed JavaScript developers to, y'know, develop their own.

    [1] See, for example, the long, sad history of jQuery code trying to do "if (typeof x == 'array')".

    [2] Most prominently Resig's hissy fit over Chrome's correct implementation of property iteration, which didn't match his fantasy of how it should work.
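    (For anyone wondering what footnote [1] refers to: typeof never yields 'array' in any conforming implementation, so a check of that shape can never succeed. A quick illustration:)

      const x = [1, 2, 3];
      console.log(typeof x);             // "object" - typeof never returns "array"
      console.log(typeof x === 'array'); // always false, whatever x holds
      console.log(Array.isArray(x));     // true - the spec-correct test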

    1. captain veg Silver badge

      Re: jQuery

      Upvoted.

      There was a time when jQuery was useful for papering over the cracks of incomplete and/or non-conforming ECMAScript implementations. That time has passed. Voluntary use of jQuery in new code is, at best, laziness.

      -A.
