Innocent websites were blocked and labelled phishers on Wednesday following an apparent conflict between OpenDNS and Google's Content Delivery Network (CDN). OpenDNS - a popular domain name lookup service - sparked the outage by blocking access to googleapis.com, Google's treasure trove of useful scripts and apps for web …
"..for the uninitiated.."
"DNS, for the uninitiated, is the vital system that points browsers at the correct servers.."
For heaven's sake, this is El Reg, not AOL. How many readers don't know what DNS is?
mmm well personally
I think El Reg could have made more of a point that this will ONLY affect those who have opted to use OpenDNS (or whose ISP pushes them out as the default servers).
To be fair...
...only yesterday a commenter here said that although he's not in the industry and not necessarily tech-literate, he does enjoy reading many of the articles and learning.
So although he could probably go off and Google "DNS", it's a nice touch for people like him, and totally harmless for people like you and me. Well, unless you feel threatened by a red-top appearing to talk down to you, that is. ;-)
You're a blogger ... right?
That struck me as odd as well.
But given the downvotes I guess we are in a minority.
I don't think the bootnote was particularly useful to the uninitiated anyway. But that's another discussion.
Those who pick it up on Google News because they are searching to find out why their favourite website isn't working?
The arrow affected the aardvark ...
... the effect was electric. (Borrowed)
But I do agree.
Although, I find it quite annoying when sites use the Google APIs since enabling them for a page (FF+NoScript) means 3rd party sites can use them too. It's particularly annoying on pages that use JS links for no good reason.
...or Computer Weekly ;-)
All the more reason
What cupid stucking funts are downvoting this point? I'm guessing crappy web developers who should never be allowed to work in the industry.
Ah Google. I've found that the latest version of Kaspersky (2012) doesn't like Chrome so it's not just OpenDNS that seem to have taken a disliking to them at the moment.
To be fair to Kaspersky, it's entirely their faulty "sandbox" security implementation, Safe Run for X, under Chrome and IE.
It does work with Firefox, so if you're using Safe Run for internet banking / Outlook, switch to Firefox for your security stuff.
It's only broken under x64, and the limitation is buried on their site... so it's just something they advertise,
then don't provide. Classy, but it's SOP for a $100+ product.
e.g. compare Sandboxie, who do have a working x64 sandbox: http://www.sandboxie.com/index.php?ExperimentalProtection
Safe Run for Applications, the component of Kaspersky Internet Security 2012, doesn’t work with Microsoft Windows XP / Vista / 7 x64.
Safe Run for Websites, the component of Kaspersky Internet Security 2012, doesn’t work with Microsoft Windows XP x64, and works with limitations on Microsoft Windows Vista x64 and Microsoft Windows 7 x64.
"OpenDNS is a globally available free DNS service"
According to OpenDNS
Apart from the obvious cheap crack that "you get what you pay for"
Did anyone who was not using OpenDNS suffer the same trouble?
google also has public DNS
8.8.8.8 or 8.8.4.4 for those who want to use it.
But then Google will KNOW about my DNS queries?? What about my privacy!?
Working properly then....
"The fact the issue popped up suddenly on Wednesday would suggest that engineers at Google had been fiddling with SSL certificates"
I wondered why a site I use went offline for a few hours...
So OpenDNS's system saw the SSL certificates as potentially dodgy and took action to protect its users from sites using SSL certificates it didn't see as authentic? I'd call that proof that it is doing what it says on the tin, and it's one of the reasons I use OpenDNS.
Re: what it says on the tin
Perhaps. Of course, it isn't actually the job of a DNS server to decide whether the answer to your query is safe to use. If there is a problem with the certificates on the target site, it is the client's job to decide how to handle that. But if you've punted that responsibility to OpenDNS, then they are indeed doing what you ask.
Either way, if people are now migrating to the MS alternative, it looks like Google have paid the penalty regardless of whose fault it is.
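FWIW the split is visible right in the standard libraries: a resolver call hands back addresses and nothing more, while certificate policy lives in the TLS client. A minimal Python sketch of the client-side half (just stdlib defaults, nothing OpenDNS-specific):

```python
import ssl

# A client-side TLS context: certificate checking is the *client's* policy.
# DNS only ever supplied an address; the resolver has no say in any of this.
ctx = ssl.create_default_context()

# The stdlib default insists the certificate chain verifies ...
assert ctx.verify_mode == ssl.CERT_REQUIRED
# ... and that the hostname we dialled matches the certificate.
assert ctx.check_hostname is True
print("cert validation is client policy, not a resolver decision")
```

If you want OpenDNS (or anything upstream) making that call for you, that's a service you've opted into, not something DNS itself provides.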
>> it isn't actually the job of a DNS server to decide whether the answer to your query is safe to use <<
But for most of openDNS's users a major reason to use the service is the <b>optional, configurable</b> nuisance filters.
99% of the use of googleapis seems to be to serve free and OSS libraries, which could easily reside on the primary site's host, without introducing unnecessary privacy intrusion and opening the visitor to potentially dangerous third-party scripts..
i.e. did Google properly bugger its certificates, or is OpenDNS in need of some work? If Kaspersky is having issues as well, I'd be more inclined to blame Google.
This is why you don't load your JS libraries remotely from Google!
That it is even possible
to load scripts from other domains is a security hole the size of the Blackwall Tunnel.
Does it really make sense to download the same piece of code, time and time again, from every site you visit? If everybody loads jQuery from one or two CDNs, then the chances are it will be in the browser's cache already. (I clear my cache on exit, but it's there.)
The issue here was not having a backup for Google.
"Because someone could go wrong with the supplier" is always a downside to using someone else's service. However "because they can provide the service faster and more reliably than you can" is still a more compelling upside, along with "without charging for it".
Given how often jQuery versions update, and how many are available, this effect is somewhat negated by the user having 1.5, 1.5.1, 1.6, 1.7 etc, but not 1.7.1. Unless you want to always bind to the latest version of an available library, which is asking for trouble when it updates.
I prefer to have a version of the code available on my site where it can't go away. The extra few millis to load the page are less important to me than it always working. Plus, I often implement code to bundle JS scripts together to save on requests, so the speed saving is negligible.
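For completeness, the usual belt-and-braces pattern is to try the CDN first and fall back to a self-hosted copy when it fails - a sketch, with the local path purely illustrative:

```html
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.7.1/jquery.min.js"></script>
<script>
  // If the CDN copy didn't load (blocked, down, or filtered by your resolver),
  // window.jQuery is undefined, so pull in the locally hosted copy instead.
  window.jQuery || document.write('<script src="/js/jquery-1.7.1.min.js"><\/script>');
</script>
```

You still get the cache-hit upside when the CDN works, and the site keeps working when it doesn't.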
another nail in the coffin
..of sites that can't work without reference to a zillion bits of other people's code.
I use NoScript, and I'm used to having to temporarily enable the domain that I'm using. But some sites are just awful and you end up going through seemingly endless domains just to load the damned content. There are some sites where virtually everything seems to rely on a remote script to load. Madness.
NoScript temporary enable.....
On NoScript: Options>General tab, you can allow top level domain temporary permission by default. This means you can open website.com and it will temporarily allow scripts from website.com for that session. It saved a bit of wear and tear on my fingertips when I found that out.
Trouble with that is if a site you use has been hacked and issues a redirect to a dodgy site, then all the dodgy site's scripts get free rein.
STUPID WEBSITE DEVELOPERS
By that token, you could say that any website that relies on CSS to look good is not a website.
Your attitude is about ten years out of date.
Besides, here we're not talking about sites which ONLY work with JS. Whether you use JS to save page loading times, or to style elements, or to provide a full app experience, you will be equally hit by this problem.
Re: STUPID WEBSITE DEVELOPERS
Correct, it's a webapp.
Happy New Year! For 2008.
Grow up you arse. Years ago people said the same about CSS and images - time and life moves on. Many, many commercial developers are driven by strict requirements and guidelines, and JS is necessary. Just because some prick like yourself decides to disable functionality does not mean that the site should still work in all its glory.
For fu**s sake...
True, from a certain point of view
"Google site blocked. Google is a fraudulent attempt to get you to provide personal information under false pretenses."
Just because we were scrabbling around fixing sites doesn't mean those sites were crippled, just that they should have been working better.
That's a rather simplistic view
The rear-seat reading lights in my car have been out for, oh, seven years or so. Since they're there, they must be there to do a job. Since they're not functioning, the car is not working as intended. I suppose I'd better attend to that before driving it again!
Much software (including most user-facing applications for general-purpose computers) these days is loaded with features that a majority of its users never use and could not care less about. Often users are happier when such "features" are not functioning, in fact. Consider Clippy, for example. Or the recent complaints on the Reg about resizing ads, and accompanying expressions of glee from users who have script- and/or ad-blocking browsers.
Many web sites use scripting to accomplish nothing useful, or provide convenience features such as client-side form pre-validation (which is often done so poorly that it's worse than omitting it would have been) or prompting (which saves, what, a few seconds at best?). Since such sites should fall back gracefully in the event of script blocking, there shouldn't be any need to "scrabble around fixing" them.
For that matter, such simple functionality shouldn't be implemented with bloated, error-ridden scripting frameworks (like jQuery) written by people who can't be bothered to read the spec and throw a hissy fit when confronted with an implementation that conforms to it rather than to their preconceived notions (Resig).
Are there web apps which are fundamentally built on client-side code, and so have to have working scripts in order to do anything useful? Yes (for various values of "useful"). But if those sites are really important to someone, they shouldn't depend on third-party-hosted code, as other people have already pointed out; and if they do, for some reason, then they should already be prepared to handle that failure mode.
I don't use OpenDNS...
...my DNS is provided by my DSL provider, so I haven't noticed any sites falling over owing to their inability to load scripts from googleapis.com. Of course, I have NoScript set to Block Scripts Globally, and manually allow scripts as needed to provide any important functionality; last I checked, I had sites like googleapis and googleanalytics tagged as "untrusted" in NoScript, so I don't load them anyway.