Destroying Firefox from within
This will simply kill off what's actually still a pretty good browser. It's the idiot's way to a new web and it won't work.
Firefox shop Mozilla recently became the latest in the long line of companies big and small trying to push the web from HTTP to the more secure HTTPS protocol. In the post-Snowden world, where everyone from the NSA and GCHQ to your ISP is inspecting and sometimes altering content, HTTPS (which makes such things nearly impossible) makes sense.
Indeed, they tried to with pointless GUI changes and a "let's copy Chrome" approach, hiding and obscuring menus and other features of interest.
Why stop there? Why not also break access to your local router, home NAS, and lots of old but interesting sites whose owners have died, moved on, or otherwise given up on maintenance?
Really, some developers seem to spend their day in a circle-jerk, reassuring each other that they are right and that the rest of the world (i.e. the users of their product) is full of fools for not agreeing.
Well, I tend to agree that it's not a good idea but I also agree that it doesn't break things as badly as has been made out.
As far as I can see, old sites won't be affected, since they won't use the new features anyway, to take one example.
Again, I agree with the article that SSL isn't without cost, but I disagree that it's hard to set up. There are loads of "guides" available on the Web, even if your SSL certificate provider doesn't give you one.
... is that sites which lurk around, turn up on searches, but which haven't been modified for years and may even be for businesses which are now defunct, should finally disappear completely.
However that's not a good enough reason for Mozilla to try to force this change on everyone just because they can.
"...sites which lurk around, turn up on searches, but which haven't been modified for years and may even be for businesses which are now defunct, should finally disappear completely."
And that is upside how? It just seems to be downside to me.
Sites with information that is years old but valid should not need to be (and should not be) modified. We are not all Millennials who need shiny!
The National Museum of Computing just issued an appeal for a data-sheet for an old piece of equipment they are trying to restore or emulate or something. Backed up with an offer of a financial reward! So, it is pretty important to them. If the web had been around then, and someone had the information they need on their old website, it would now be valuable.
In which case, there are such things as The Wayback Machine which archive information.
But defunct sites will no longer appear in search results.
@Graham Marsden
and how are you supposed to find vital information that is only on an old website, if it doesn't turn up in searches?
A lot of websites which just give out static information don't need to be encrypted. And if an old site is no longer maintained, but is kept going because it provides important information that is still needed, why should its owners have to invest in making it HTTPS compliant? They are offering the information as a public service; Mozilla has just decided that isn't good enough.
It isn't just about businesses with developers. It's about typical users with their own .me domain name hosting their blog about their kitten; they keep up to date with their updates, and their blog has all the HTML5 bells and whistles... and is blocked from Firefox, because now they have to go and get a certificate, install it and, when it is about to expire, renew it, or face a huge "this site is dangerous, don't go there" message.
It isn't just about businesses with developers, it's about typical users with their own .me domain name hosting their blog about their kitten
So in your mind, the "typical user" runs and maintains their own VPS, on which they manually install and configure wordpress and apache? And, despite their apparent server-side expertise, they're incapable of adding a few lines to their apache config to enable SSL?
Are you sure the "typical user" won't actually be using a hosted service for their blog, where all the complicated back-end stuff is actually done by a business with developers?
> Are you sure the "typical user" won't actually be using a hosted service for their blog, where all the complicated back-end stuff is actually done by a business with developers?
Quite probably yes.
But, they won't be doing a lot of stuff for free. I have a site which was a blog but which is now pretty static - I've had no reason to update it for some time. I started it <cough> years ago when information on the subject was "closely guarded" by vested interests who didn't want DIYers to know how to do stuff for themselves. Things have changed, information is out there, but a lot of the information - whilst old - is still relevant and useful. I know quite a few forum posts link to some of its "how and why to do this" articles.
But, it's hosted with the "free" hosting that comes with my internet package. It's HTTP only, and HTTPS is not offered on the host - at least, not under my domain name. If I want to SSL-enable the site I'll have to move it, pay to host it elsewhere, and probably pay extra for SSL.
Mind you, I might need to move it anyway. The server it is on does have an SSL service running - but not for customer sites. So the "helpful" browsers that try SSL first come up first with a certificate warning, and if that doesn't scare the user away, they get a completely unrelated ISP site.
Extra for the cert itself - even if the hosting provider deal with the technicalities and automates the human cost out of it, there's still the cert cost.
And then there's the extra cost of having a separate IP for it. That is, unless you want to cut off all XP (and earlier)* users from accessing the site. Some may think this is a good thing; personally, I don't see that deliberately cutting off a chunk of users, just because someone somewhere thinks it's a good idea to "force" HTTPS on sites where it's not needed, is warranted.
* I know that XP and earlier can't access SSL enabled sites using host headers to differentiate between sites. I have no idea what the situation is for other OSs.
And if you're a mom and pop business, with a site set up via a "click this button to deploy your website" service? Which developer is going to be fired? There isn't one. Websites don't all come from developers...
We are even teaching kids how to create their own websites in schools. Are we expecting them to go and pay for SSL certificates too?
I'm terribly sorry but I think I missed the bit where they explain that I shouldn't be visiting sites like http://www.catsthatlooklikehitler.com/ and http://thecatscan.tumblr.com/ because they're not https enabled.
If the site has no personalised content, it's the same for everyone (say wordpress with NO comments enabled), what actually is the point of the HTTPS?
It's horrendously more expensive compared to ordinary shared IP address Linux hosting. Before you even factor the certificate costs.
It means the stuff can't be altered in transit by an ISP or a malicious party. Think Verizon's client ID or the Chinese Cannon. Encrypted EVERYTHING is the most practical way to deal with these kinds of man-in-the-middle alterations, and a TLS-based protocol is the best option we have that's in wide use. Also, HTTPS has the big benefit that it's already in use, unlike Berners-Lee's proposal which is over 15 years old (RFC2817) and requires browser rewrites to support a protocol that doesn't exist yet (which may not be an option for old-but-still-in-use programs).
It gets you lots more than that. Your entire GET request is encrypted. So if you browse the BBC news site at work, currently your boss can track not just your use of the site, but also what stories you're interested in. Perhaps you show a sudden interest in cancer stories and so you lose your job in some downsizing, just in case you go off on sick leave. With https, all your boss knows is that you spent 20 minutes reading the news.
By hiding the metadata https benefits more than just users of sites that personalize content.
That's only if your corporate overlord pushes their MITM CA certificate to your browser's certificate store. As most browsers use Windows' certificate store, this is quite easy - unless you use Firefox, which will kick up a stink, as it uses its own certificate store which won't have been meddled with.
"It means the stuff can't be altered in transit by an ISP or a malicious party."
Correction: if someone is currently intercepting your HTTP traffic, they could already have set up a fake certificate authority server. That means your HTTPS traffic is already intercepted. It also doesn't take into account the proxy servers around the globe that already intercept HTTPS traffic.
Only half-joking... remember how the security enhancements of Windows Vista worked out in practice? A security nag dialog every time you scratched your nose, with two results: clicking "go ahead and do your worst" became muscle memory AND everybody wanted a quieter OS. For some MS users that was iOS or Ubuntu; for many more it was eventually Win7 (greeted like the Second Coming simply for not sucking so hard).
Piss users off enough and they'll even buy a new machine. But in the browser space? A quick non-disruptive download away will be Chrome, IE, Opera even...all eager to import bookmarks and provide hand-holding for the migration.
Another option would be for HTTP to follow in the footsteps of FTP and introduce an explicit mode that allows clients to optionally step up to TLS over the native port and allows mixed content.
With FTP-ES, a client connects to an FTP server on port 21 using clear text. The client can then request to enter secure TLS mode. If the server doesn't support it, the client can abort or continue. Likewise, if the client doesn't support FTP-ES, the server can restrict access. The protocol also allows granular encryption of the control channel, the data channel, or both.
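For reference, Python's standard ftplib already models this explicit upgrade; a minimal sketch (host and credentials are placeholders):

```python
# Sketch of the explicit ("ES") upgrade using Python's standard ftplib.
# FTP_TLS opens a clear-text connection on port 21; auth() sends AUTH TLS
# to upgrade the control channel, and prot_p() encrypts the data channel.
from ftplib import FTP_TLS

def explicit_tls_session(host: str, user: str, password: str) -> FTP_TLS:
    ftp = FTP_TLS(host)        # clear-text connection to port 21
    ftp.auth()                 # step up: AUTH TLS on the control channel
    ftp.login(user, password)  # credentials now travel encrypted
    ftp.prot_p()               # switch the data channel to TLS too
    return ftp
```

If the server rejects AUTH TLS, `auth()` raises an error and the client can decide whether to carry on in the clear - exactly the negotiation described above.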
In a theoretical HTTP-ES, a new HTTP 4xx code could be introduced that requires clients to step up to TLS mode. It could require granular encryption of cookies, all headers, payloads or everything. Likewise, clients could automatically request TLS if they are trying to present a secure cookie to the server, or if the user prefers it. It could also provide a CRC of the payload in the encrypted portion to guard against MitM tampering.
The benefits of HTTP-ES would be: no broken bookmarks, lower overhead when all you need is cookie or header obfuscation, increased protection against MitM attacks and some compatibility with external cache servers.
"The benefits of HTTP-ES would be: no broken bookmarks, lower overhead when all you need is cookie or header obfuscation, increased protection against MitM attacks and some compatibility with external cache servers."
The big drawback, as some ISPs have shown, is that even this initial handshake can be exploited to man-in-the-middle the connection BEFORE the secure phase can take place. About the only way you can prevent this is to start the connection with the key exchange and don't continue without it being complete; otherwise, that crack in the door is enough to get the proverbial foot in. IOW, don't do ANYTHING in the clear, not even a request to go secure; you MUST go secure from the get-go.
HSTS, certificate pinning, and opportunistic encryption more-or-less implement your idea. Well, opportunistic encryption did implement your idea but it's on hold in Firefox while the bugs are being ironed out.
As for the cost, you can use self-signed certificates with opportunistic encryption without the browser displaying any warning because with OE the important thing is the encryption, not the authentication.
So, unfortunately, the article's premise is completely wrong - the cost is a big fat 0, apart from a slightly increased encryption/decryption load at both the client and server end.
No browser supports OE apart from Firefox 37, which did for a few days until a bug was found and 37.0.1 was pushed out. By the time the change Mozilla proposes is implemented, OE will have been brought back into Firefox anyway.
OE specifically allows self-signed certificates.
And if your browser doesn't support OE life carries on as normal under the http protocol.
I mean, if you have certificate pinning, self-signed certificates are about as secure as official ones. Sure, if an attacker can spoof the connection every time, you have a problem, but then you don't get the problem of false certificates issued by rogue CAs.
I fear that this is only the beginning of the problem.
Creating self-signed certificates is easy for a techie like me, but will be a nightmare for the 95% of website creators who aren't IT pros.
Expect web hosting for non professionals to become significantly more expensive as hosting firms have to cope with implementing this in a user-friendly way. Or else, watch Firefox (my preferred browser) suffer an alarming drop in usage as it becomes "the browser that doesn't work anymore", or "the browser that constantly nags me about certificates" or "oh my word I can't believe I'm using Internet Explorer again".....
Expect also a rash of exploits whereby people offer to "sort this thing out for you" by advising users to automatically accept self-signed certs and then sharing around both public and private keys "to make things easier", thus defeating the whole point in the first place.
What do you mean by "not handling it well"?
I created my own CA and am signing my own certificates with that. On my browsers and devices I installed said CA and all is very well. Yes, even in FF. Only Android older than 5 doesn't cooperate well.
Upside is that I can create new certificates as I wish and not have to touch every single client.
And as for difficulty, there are plenty of guides to be found for this three-command-line operation.
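For concreteness, here is one way those three command lines might look, driven from Python's subprocess so the sequence is reproducible (this assumes the openssl CLI is installed; key sizes, lifetimes, file names and subjects are all illustrative):

```python
# The "three command lines" for a private CA, run via subprocess.
import subprocess

def run(cmd):
    subprocess.run(cmd, check=True, capture_output=True)

# 1. CA key plus a long-lived, self-signed CA certificate.
run(["openssl", "req", "-x509", "-newkey", "rsa:2048", "-nodes",
     "-keyout", "ca.key", "-out", "ca.crt", "-days", "3650",
     "-subj", "/CN=My Private CA"])

# 2. Server key and a certificate signing request (CSR).
run(["openssl", "req", "-newkey", "rsa:2048", "-nodes",
     "-keyout", "server.key", "-out", "server.csr",
     "-subj", "/CN=nas.home.example"])

# 3. Sign the CSR with the CA. Any client that trusts ca.crt
#    will now accept server.crt without a warning.
run(["openssl", "x509", "-req", "-in", "server.csr",
     "-CA", "ca.crt", "-CAkey", "ca.key", "-CAcreateserial",
     "-out", "server.crt", "-days", "365"])
```

The catch, as the replies below point out, is distribution: every client device still has to install ca.crt by hand.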
>> I created my own CA and am signing my own certificates with that
Which immediately singles you out as most definitely NOT the average user. Seriously, I know someone who is stumped if the browser window comes up smaller than normal (no idea how he got it that way) so it doesn't show his G-Mail Inbox ! The description for this problem was "the internet doesn't work".
Plus, that only works for devices and sites you have control over. So you install your CA root cert in your devices - that's fine, now do that for all the users of a site you want to use your self-signed cert on. What's that, you don't know who, out of the world's 6 billion people, might want to access your website ? Tough !
Put simply, in the general case (which is being discussed here) - if your site cert doesn't have a valid chain back up to a CA trusted by the client, then it's going to pop up the "here be dragons" dialog for the user.
This should not be a problem for end-users if Mozilla add some code to handle the site 'upgrade' from HTTP to HTTPS automagically.
Firefox could simply have an option (off by default, but with info on how to turn it on through custom error handling) to automatically switch to HTTPS URLs when the HTTP URL does not respond (e.g. TCP port timeout or connection refused).
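A minimal sketch of that fallback (illustrative only; the function names and timeout are invented, and this is not Firefox's actual behaviour):

```python
# If the http:// URL fails to connect, retry the same URL over https.
from urllib.error import URLError
from urllib.parse import urlsplit, urlunsplit
from urllib.request import urlopen

def to_https(url: str) -> str:
    """Rewrite a URL's scheme to https, leaving everything else intact."""
    parts = urlsplit(url)
    return urlunsplit(("https",) + tuple(parts[1:]))

def fetch_with_upgrade(url: str, timeout: float = 5.0) -> bytes:
    try:
        with urlopen(url, timeout=timeout) as resp:
            return resp.read()
    except (URLError, OSError):
        # e.g. TCP timeout or connection refused: try the HTTPS equivalent
        with urlopen(to_https(url), timeout=timeout) as resp:
            return resp.read()
```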
Another major problem with https... I use a caching web proxy. This doesn't do any naughty "man in the middle" on https, so https content is not cached. I'm with those who say, if I'm going to some random site where the content is the same for everyone, what difference does https make security-wise? None whatsoever.
I second SSL certificates being excessively hard to install. I mean, I got it done, but it wasn't as easy as I thought it should be.
You can fix the caching with hashing. Request the hash first then compare with the hash of your cached copy. Easy to implement for static content (dynamic content you can't cache anyway). And as for ISP caching, screw them as they can alter those copies and produce false pages AND hashes. You want something, go to the source; it's the only way to be sure.
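A minimal sketch of that hash-check, assuming SHA-256 and invented function names:

```python
# Hash-based cache validation for static content.
import hashlib

def cache_is_fresh(cached_body: bytes, origin_hash: str) -> bool:
    """True if the locally cached copy matches the hash the origin
    server reports for the current version of the resource."""
    return hashlib.sha256(cached_body).hexdigest() == origin_hash

# The cache asks the origin only for the hash first, and re-downloads
# the body only on a mismatch.
body = b"<html>static page</html>"
origin_hash = hashlib.sha256(body).hexdigest()  # what the origin would report
assert cache_is_fresh(body, origin_hash)        # cached copy still valid
assert not cache_is_fresh(b"tampered", origin_hash)
```

As the comment says, the hash itself has to come from the source over a channel the ISP cannot tamper with, or the scheme collapses.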
You can fix the caching with hashing
Oh, good, because HTTP caching isn't already a fucking overdesigned nightmare, with Cache-control and If-modified-since and ETags and a hundred other magical factors.
Though, to be honest, it's not clear to me how you think this fixes HTTPS issues with caching proxies. Or, technically, gateways, which seems to be what the OP was referring to, though it's hard to tell.
Because maybe not every site has credentials/logins or content that needs protection?
I have (amongst others) a simple site, mostly static content, no logins or confidential stuff. Have it hosted on the cheap, yet those cheap hosts get very expensive when the word 'SSL' drops.
Let them take care of the SSL Mafia first and force SSL certificates to be cheap or free; then we'll talk. Right now it is catching people between a rock and a hard place, which makes it feel like FF is part of the Mafia.
"I have (amongst others) a simple site, mostly static content, no logins or confidential stuff. Have it hosted on the cheap, yet those cheap hosts get very expensive when the word 'SSL' drops."
And without SSL, your content can be MITM'd. If for no other reason than because it's being transmitted in the clear so can be altered mid-flight.
Brendan Eich had a brain and was actually into technology for people. Other mozzleheads were more interested in pushing how technology _ought_ to be (their conception, anyway).
I have posted elsewhere about the 13-year-old bug report that was closed wontfix because of 2 mozilloids who "didn't like" the technology standard. Their proposed replacement hasn't progressed in the 2-3 years since. Who would've guessed that?
I still use FF as my main browser, but more and more often resort to Chrome. Loyalty, when you have to fit into Mozilla's tire tread pattern, is a stretch.
Yes, like the shitty business of defaulting to US Legal paper size on every update on *NIX platforms. That bug has also been open for more than a decade. Maybe a small amount of time fixing stuff would bring more happiness to users than pointless dicking around with GUIs and pushing policies out that break things?
... security will go astray.
It looks like Mozilla doesn't understand that certificates have a dual use: first, to ensure the identity of a site; second, to encrypt the communication. Mozilla seems focused on the latter, but encryption doesn't make you safer if you don't know who you're really talking to.
Today, getting a certificate is expensive because there *should* be some vetting before it is issued. I'm sure the day certificates are sold for pennies each, the "godaddy" of certificates will sell millions of them without the most minimal check, just to make money, exactly as domains are sold to people using blatantly fake registration information. It will become easy to obtain a certificate for any domain you like, except maybe the known brands.
Then, certificates will become pretty useless for both ensuring identity and encryption.
"they'll still prevent man-in-the-middle alteration"
If world & dog just ignores dodgy or revoked certs (like Google do in Chrome) when so many stink and/or change for no good reason, then what is to stop an ISP doing a proxy with some self-signed cert for everywhere?
Well then, we're screwed, because Trent can ALWAYS be subverted by Mallory or Gene. And without Trent, we can't trust anyone, which means we can't talk to anyone in a paranoid world. We're either going to have to take a leap of faith or shut ourselves off, including physically since one can demonstrate that first contact is the most vulnerable phase of communication and the one that's impossible to fully secure due to lack of prior information (I suspect a paradox can be applied to this but I can't recall any specifics—trying to use a Trent brings up the "Quis custodiet ipsos custodes?" problem).
Forgot to mention network printers - how many of them support https? And how well is that going to work with DHCP?
What does DHCP have to do with HTTP/HTTPS? Oh yeah, nothing....
Also, you do a disservice to systems admins/engineers by repeatedly writing that only developers can manage redirects and handle the nuances of making SSL work. I've seen both sides do their part and, realistically, the sys/network camp handles all transactions for a given namespace far better than doing it in code, by using the tools that were meant to do it.
"What does DHCP have to do with HTTP/HTTPS?"
Don't certificates sign for a given IP address? What if that changes?
"Also, you do a dis-service to systems admins/engineers by repeatedly writing that only developers can manage redirects and handling the nuances of making SSL work"
OK so who patches old expensive colour A3 laser printers to add SSL support? Have you seen much sign of software patching/upgrades even for new/recent printers?
Don't certificates sign for a given IP address?
No.
X.509v3 certificates can carry a variety of subject-identifying information. IP address is not generally included, at least for the certificates used for TLS. These days, an X.509v3 certificate used to identify an endpoint for TLS will probably be matched against the entity offering it by comparing the FQDN against the value of a subjAltName extension of type dnsName.
There are many other possibilities; for example, in many cases TLS client-side endpoints are authenticated using certificates that identify a user rather than an endpoint. (Here the certificate's DN might be examined, or the server might simply have a previously-established database mapping certificates to user identities.) But dnsName-type subjAltName extension == FQDN (possibly with wildcarding) is the most common case.
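For illustration, a greatly simplified sketch of that dnsName-against-FQDN comparison (the real matching rules are specified in RFC 6125; like browsers, this only lets a wildcard stand in for the single left-most label):

```python
# Simplified matching of a dnsName-type subjAltName against an FQDN.
def dns_name_matches(pattern: str, hostname: str) -> bool:
    p = pattern.lower().split(".")
    h = hostname.lower().split(".")
    if len(p) != len(h):
        return False
    if p[0] == "*":
        return p[1:] == h[1:]   # "*.example.com" matches "www.example.com"
    return p == h

assert dns_name_matches("*.example.com", "www.example.com")
assert not dns_name_matches("*.example.com", "a.b.example.com")  # one label only
assert dns_name_matches("example.com", "EXAMPLE.com")            # case-insensitive
```

Note that nothing here involves an IP address, which is why a certificate keeps working when the site's address changes.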
This is an unusual set of comments... and this is the first to mention the three-letter agencies in all of this. Since when have Google and Mozilla proven their independence from pressure by 'those people'? If you are going to have trouble scanning content in HTTPS-land, it makes sense to encourage anyone whose commercial self-interest in 'owning it all' is congruent with your own desire to 'see it all'.
It seems to me that we have to have an 'Easy-HTTPS' solution that is under the direct control of everyone-and-his-dog and is compatible with HTTP public-repository-type sites. The moment you have just a few gatekeepers, you can be quite sure that they will have masters identified by three-letter words. They want to be, and will ensure that they are, the custodians of any custodians that we might create... and you can be quite sure that they will represent only one of the global trading blocs.
No prizes for guessing which one; after all, we are talking about protecting 'someone's' interests here, and those, you can be quite sure, are economic in the final analysis.
The question to be answered before designing in security to the 'net is this "Is the Internet a 'global public utility for humanity' or does it belong to only those who love the 'land of the free'?"
Nice one Firefox, take the web away from normal people and make it business or techies only. Utterly disgusting and heavy handed attitude, I hope your browser dies a thousand deaths. Luckily people are quite capable and happy to vote with their feet (or mice in this case) so your fate will be sealed by this decision.
Left hand: I won't let you visit a site using SSL unless that certificate is signed by an authority that I already know, and I mostly know commercial authorities.
Right hand: I won't let you visit a site unless it uses SSL.
Real world: grow the fuck up. Not all sites need their traffic to be encrypted. If they do, accept self-signed certificates.
It's time to separate "protecting a site via encryption" and "validating a site via PKI". SSL is (incorrectly?) used for both.
How many root CAs are just some 3-letter agency? Plus, isn’t TLS compromised anyway?
How many businesses run their own CAs so they can MITM all traffic leaving the corporate network?
How many sites use CDNs for HTTPS, where the CDN is decrypting everything, eg https://www.cloudflare.com/ssl?
Given the routine disruptions to encryption in the millennial Internet, how different is the level of real security delivered by HTTPS and that of the TSA?
Even the article is wrong, and this is supposedly a tech publication. There is no cost. Use two or three HTTPS technologies together and you will end up with Firefox opportunistically encrypting the connection using your server's self-signed certificate, without complaining.
And the stick? That if you don't update your server's configuration, clients connecting with Firefox won't get the new CSS3+/HTML5 features coming in future versions of Firefox. Stuff the clients wouldn't have had anyway, because if you can't update a bit of server configuration you sure as hell aren't going to be using cutting-edge CSS/HTML.
And now that I have said my piece I shall flounce off in a cloud of righteousness.
How would this break the web? You said in the article yourself it would only block HTML5 content!
That means most of the old sites which people don't update will be fine, as they don't use HTML5; in fact, the only ones which do are likely active. It won't stop people doing what they are doing now, but it will essentially mean it costs to use the new features.
Maybe they should just focus on some of the features like GEO-Location and other potentially private or sensitive items as you suggest.
Mozilla is trying its damnedest! Ever since they "acquired" Netscape, they've gone downhill (now up to 35.0.8 as of 2200 EDT 15/6/2 [after a restart]).
What's a "decent" browser now?
IE, yeah right; Chrome, and goooogle's data-slurping overload; Safari, no thanx...
Can someone suggest an unbiased browser (no commercial interest/walled garden), like Netscape USED to be, for a Win XP/7 system?
There has been a free SSL certificate issuing organization around for many years: CACert.org
And, yes you can revoke a certificate for free.
Unlike organizations of possibly dubious reputation that verify certificate requests by email alone, CACert relies on a network of trusted individuals who must meet new applicants in person and verify that person's identity before signing off on a credential.
The problem has always been that the major browser distributions have refused to add the CACert root certificate to their default list of vendors. How did WoSign and StartSSL manage to overcome this stumbling block?
Can you get a certificate from LE? Today?
If you can't, then it's still currently vapourware
This really isn't the case. Do you think all announced games coming out are vapourware? Is Windows 10 vapourware? LE is scheduled to come out in September. It hasn't been dragging on for years of announcements and postponements. It's not vapourware.
"This really isn't the case. Do you think all announced games coming out are vapourware? Is Windows 10 vapourware?"
By my definition, until it's actually physically available, either downloaded onto my hard drive or in my hand as physical media, then it's vaporware. Until then, anything can happen, including a renege. So as they say, "I'll believe it when I see it."
In the post-Snowden world, where everyone from the NSA, GCHQ to your ISP is inspecting and sometimes altering content, HTTPS (which makes such things nearly impossible) makes sense.
In some cases. Not in all of them. Typical security amateurs - and that includes Gilbertson - failing to understand threat models and insisting that everyone agree with theirs.
Damn, but I'm weary of the HTTPS zealots.