Lots of developers are used to live editing files on the server to fix bugs. Are they screwed now?
New Firefox, Chrome SRI script whip to foil man-in-the-middle diddle
Scripting will in the next few months become safer with Mozilla and Google adopting a validation mechanism to protect against man-in-the-middle attacks. The Subresource Integrity (SRI) check is being developed by boffins at Google, Mozilla, and Dropbox under the World Wide Web Consortium. The specification means the integrity …
COMMENTS
-
Thursday 4th June 2015 07:54 GMT wolfetone
I wouldn't say they're screwed. If bugs are fixed properly, the work is done locally and then pushed to the development server; that process offers a natural opportunity to hash the files.
I think every developer, myself included, has edited files live on the server for one reason or another. But security has to be important, and if it means adding one extra step to the proper workflow, and actually using that workflow, to help secure visitors to websites, I'm happy enough to do that.
-
Thursday 4th June 2015 11:58 GMT Dan 55
Presumably you could get the browser to look at a hash value stored in a .json file on the same server (so you'd have a .js and a .json next to it), so you'd need to edit two files instead of one; or maybe the server itself could generate the hash and pass it to the browser in a header.
We don't know what happens when the hash check fails. Maybe the .js file is ignored and the cached copy is used until it expires.
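(For what it's worth, the SRI draft answers both questions: the hash travels in an `integrity` attribute on the `<script>` tag itself rather than in a sidecar file, and on a mismatch the resource is refused rather than silently swapped for a cached copy. A minimal sketch of generating such a value; the script content here is made up:)

```python
import base64
import hashlib

def sri_value(data: bytes, algo: str = "sha384") -> str:
    """Build an SRI integrity string, e.g. 'sha384-<base64 digest>'."""
    digest = hashlib.new(algo, data).digest()
    return algo + "-" + base64.b64encode(digest).decode("ascii")

script = b"alert('hello');"  # stand-in for the real file contents
print(sri_value(script))
# The page then references it as:
# <script src="lib.js" integrity="sha384-..." crossorigin="anonymous"></script>
```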
-
-
Thursday 4th June 2015 07:49 GMT FF22
This is not against MITM attacks
Contrary to what the article implies, this technology has obviously not been developed primarily to thwart MITM attacks, because it can't defend against those in general.
That said, there are some MITM attacks it can prevent, but those are only a very specific and small subset: the ones where the man in the middle can manipulate only the external resources, but not the referring page. The typical proxy, corporate or ISP-level MITM attacks will still work, because the attacker can also manipulate the HTML file containing the hash values to match those of the manipulated script files.
On the other hand, this is completely anti-web, because it effectively kills the dynamic nature of the web, which is the very essence of it. With the hash check in place, changes to the delivery and representation of external resources (to adapt to, for example, bandwidth or device constraints) will no longer be possible, to say nothing of applying bug fixes, correcting typos or making other well-intentioned modifications.
Actually, it's kinda pointless, because if someone is so worried about the modification of external resources, they should just host the resources themselves. That would prevent any and all kinds of MITM attack (and any other type of attack, for that matter) that this technique can be effective against, and wouldn't require new browsers or extensions.
-
Thursday 4th June 2015 12:52 GMT Anonymous Coward
Re: This is not against MITM attacks
"... they should just host the resources themselves..."
I do, and while I wear a 7-layer tin-foil hat, I know how my pages work with the version of the software that I've downloaded and linked-to locally. It doesn't bother me that I'm more than 50 versions out-of-date, either.
-
Thursday 4th June 2015 08:32 GMT DaLo
"Marier also urged organisations to add themselves to the browser pre-load list which requires sites to run HTTP strict transport security (HSTS)."
This does not seem very scalable; the preload list is hard-coded into transport_security_state_static.json as part of the build (in Chromium). At the moment it has about 2,100 domains, but if every organisation is encouraged to add itself, surely the list and code will rapidly become unmanageable?
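(The eligibility rules for that list are at least mechanical: the site must serve a Strict-Transport-Security header with a sufficiently long max-age plus the includeSubDomains and preload directives. A rough sketch of that check follows; the one-year minimum is an assumption, since the list's exact requirement has changed over time:)

```python
ONE_YEAR = 31536000  # seconds; assumed minimum max-age for preload eligibility

def preload_eligible(header: str) -> bool:
    """Loosely check an HSTS header value against the preload-list rules."""
    directives = [d.strip().lower() for d in header.split(";")]
    max_age = 0
    for d in directives:
        if d.startswith("max-age="):
            try:
                max_age = int(d.split("=", 1)[1])
            except ValueError:
                return False
    return (max_age >= ONE_YEAR
            and "includesubdomains" in directives
            and "preload" in directives)

print(preload_eligible("max-age=31536000; includeSubDomains; preload"))  # True
print(preload_eligible("max-age=300"))                                   # False
```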
-
-
Thursday 4th June 2015 19:37 GMT Charles 9
Re: Where are the hashes
No, I'm talking about a malware SSL proxy relay that's masquerading as the target site. With a fake certificate, it can pass itself off as the target: your browser gets the green light because the connection looks secure, but the proxy can decrypt and alter the traffic to and from the actual target, and you basically have no way to tell the difference. The corporate SSL relay is basically a legit version of the malware SSL proxy.
In any event, this malware proxy can masquerade as either the site host, the script host, or both, allowing the altering of script and signature no matter where it comes from. It's back to the classic "Who do you trust?" issue.
-
Monday 8th June 2015 11:02 GMT DaLo
Re: Where are the hashes
Eh? Where would they get the fake certificate from? A corporate proxy works because it administers the machines 'below' it. Therefore it creates a trusted root certificate in the computer's certificate store allowing it to impersonate another site.
An unconnected third party doesn't have that luxury. Yes, there are a few instances of a compromised trusted certificate store etc., but they are relatively rare outside of state control. If there were an easy way to install trusted root certificates on a user's PC, then the whole premise of SSL would be broken and there would be much more to worry about than a rogue script.
At the moment the consensus is that SSL generally works and it is that premise that allows for a secure internet.
As to the original point: it then also wouldn't matter whether the hash and the script had been simultaneously compromised, because it would no longer matter whether the site you were visiting used scripts at all. Your connection is no longer secure, and you are being fed whatever the attacker wants to feed you, whether you ask for it or not.
-
-
Thursday 4th June 2015 11:31 GMT Anonymous Coward
The fact that...
... the majority of web "developers" are quite happy to link to API code from an external site that they've probably never even looked at, much less verified, to run their JavaScript, which pretty much says it all about how much they know or care about security. Could you imagine a C++ or Java dev #include-ing some random code, or linking against a binary lib or jar file fetched from offsite on each compile? Exactly.
The whole JavaScript software model is fundamentally broken. Any dev worth his salary would download the code, at least give it a once-over, then host it on his own website to be loaded locally with the page, which not only makes it faster but also prevents this sort of code injection from being an issue.
-
Thursday 4th June 2015 13:27 GMT Anonymous Coward
Re: The fact that...
"The whole javascript software model is fundamentally broken. Any dev worth his salary would download the code and at least give it a once over first then stick it on his own website to be accessed locally when the page loads which not only makes it faster but also prevents this sort of code injection being an issue."
Nope, because a MITM that can masquerade as the source will also happily alter the LOCAL copies, which may indeed be stale with security holes. Pulling from the originator at least has the benefit that it's by default the fastest route to an official fix.
The cynic in me says that the entire model of the Internet is broken against the human condition. In fact, the whole idea of mass communication has this problem. It relies on a level of trust that simply cannot be guaranteed. Without experience (the dreaded "First Contact" problem), there is simply no way for Alice to know Bob is really Bob. Not even Trent is of help since there's no real way to know Trent is really Trent, and this can recurse infinitely.
-
Thursday 4th June 2015 14:37 GMT Anonymous Coward
Re: The fact that...
"Nope, because a MITM that can masquerade as the source will also happily alter the LOCAL copies,"
Unless he's got access to every web server, or he's a man in the middle on every single router on the internet, that's unlikely to happen. With one point of origin for a JavaScript API, he only has to hack into one system.
-
Friday 5th June 2015 10:39 GMT FF22
Re: The fact that...
Seems like the whole point of the internet and the web (i.e. that they're not local, but distributed) went whoosh over your head.
Oh, and by the way, practically all C++ and Java programs DO link to and call into code that they do not verify on each run, in the same way web pages do with JavaScript. See dynamically linked libraries and Java packages!
-
Friday 5th June 2015 13:52 GMT Anonymous Coward
Re: The fact that...
And it seems the whole WEAKNESS of the internet and the web (ie. that resources are impossible to accurately attribute, meaning you can be trojan-horsed without your knowledge) got stuck on your neck. Combined with a global attitude approaching "Don't Trust Anyone," we're gonna need something rather more complex and comprehensive than this "script whip" to solve the problem.
Assuming, of course, the problem is actually tractable. Then again, we've yet to figure out how to solve the First Contact problem (Alice and Bob vouching identities when they've never met before) without some sort of Trent (that can HIMSELF be subverted).
-
-
Friday 5th June 2015 16:30 GMT Stevie
Re: Bah!
No, it isn't useful. Client-side JavaScript allows people to add shiny to webpages, that's all, and to add shiny it has to leave us open to stupid attack vectors that the JavaScripters are incapable of defending against. It offers no functionality that cannot be safely lived without, since any user-generated "content" sent down the pipe has to be validated at the server anyway (unless one doesn't subscribe to the common-sense practices learned back in the bad old days of mainframes, COBOL and transaction processing; one deserves all one gets if one doesn't, but one's customer base deserves better).
If you use JavaScript on the server, all I can ask is "why?" since there are many technologically and ideologically better options available, options that provide more features with better OO implementation on whatever OS you happen to be using.
And I don't have to explicitly show you or anyone else jack spit. All I have to do is sit back and sigh every time another round of pwnership is laid at the feet of this boil on the backside of the web.
-
Sunday 7th June 2015 15:38 GMT Anonymous Coward
Re: Bah!
"No it isn't useful. Client-side JavaScript allows people to add shiny to webpages, that's all, and to add shiny it has to leave us open to stupid attack vectors that the JavaScripters are incapable of defending against."
OK, how do you do something like a live board or another dynamic web page WITHOUT client-side code OR constant and annoying (as in customers will probably abandon you) refreshing? Without stuff like Ajax, you'll have to do large and time-consuming PostBacks rather than the more efficient CallBacks: again, a major headache for web apps. Not to mention that doing everything server-side will put additional strain on the servers, in a day where every hertz increasingly counts. The only alternatives to JavaScript are Java, Flash, and Silverlight. Guess what? They're more despised than JavaScript! What you're saying is that we have to abandon the web app altogether. News flash: the web app now is like alcohol during Prohibition; people will abandon you before they abandon the web app.
"And I don't have to explicitly show you or anyone else jack spit. All I have to do is sit back and sigh every time another round of pwnership is laid at the feet of this boil on the backside of the web."
Yes you do, or your argument wouldn't survive a court (and this IS a court in a sense—a court of public opinion). Specifics or you're making a hollow case, in which case you will likely be ignored. I at least specifically note why JavaScript is still needed (because server-side can't do dynamic content).
-
Monday 8th June 2015 05:20 GMT Charles 9
Re: Bah!
So you call an online collaborative whiteboard bollox? An online graphical recognizer for an elaborate language like Chinese or Japanese, where specific stroke order is important? A place like eBay, where timing is key (the delay of a page load can be the difference between winning and losing an auction; I can speak from experience)?
So you ask WHO says web pages have to be dynamic? YOUR CLIENTS DO! Time is money in today's society, so static pages are passé.
-
-
Sunday 7th June 2015 15:17 GMT Michael Wojcik
jQuery
he found 2.5 million references in GitHub accounts to a jQuery script hosted on a Google server.
"What could possibly happen if someone gets into that server? An attacker could add a malicious payload ..."
What, more malicious than jQuery itself? The mind boggles.
Why so many people are so eager to use a library written by people who not only don't understand the language they're working in [1], but are prone to throwing tantrums when implementations follow the language specification rather than the jQuery author's personal beliefs [2], is beyond me.
Though I suppose few of the Javascript libraries are better. And we can hardly expect self-professed Javascript developers to, y'know, develop their own.
[1] See, for example, the long, sad history of jQuery code trying to do "if (typeof x == 'array')".
[2] Most prominently, Resig's hissy fit over Chrome's correct implementation of property iteration, which didn't match his fantasy of how it should work.
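(For readers wondering why footnote 1 is damning: `typeof` in JavaScript has no 'array' result, since arrays report as plain objects, so that comparison can never be true. A quick demonstration, runnable in Node or a browser console:)

```javascript
// typeof never yields 'array'; arrays are just objects to it,
// which is why the jQuery test in footnote 1 could never succeed.
console.log(typeof []);              // 'object'
console.log(typeof [] === 'array');  // false

// The reliable checks:
console.log([] instanceof Array);    // true
console.log(Array.isArray([]));      // true (ES5 and later)
```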