Déjà vu
So, they want brownie points for reintroducing a feature that was standard across the board in 1995?
Google's tweaked the Data Saver in the mobile version of its Chrome browser, making images an opt-in luxury for those on slow connections. “After the page has loaded, you can tap to show all images or just the individual ones you want, making the web faster and cheaper to access on slow connections,” Google says, claiming “up …
Browser options, accessibility, & untick the auto-load images box.
It still shows placeholders where the images are supposed to be, & activating a placeholder loads that image, but otherwise your browsing speeds up considerably when you're not forced to grab every damned single-pixel web bug, web beacon, or half-gigabyte selfie bullshit barfed into your bandwidth.
9 times out of 10 the dipshit that posted it didn't bother to include proper alt text (glares at you, ElReg), so it means absolutely *bollocks* to anyone who can't see the image for whatever reason.
So just turn images off in the options & get the same result as Google's self-congratulatory wank fest.
It's amazing how much faster pages load when you don't have to pull down a couple of megs' worth of pics just to display ~5 KB worth of content.
“After the page has loaded, you can tap to show all images or just the individual ones you want …”
How can you know which ones you want to see on a given webpage without seeing them? Something seems to be missing from this statement. I would think they'd at least show a thumbnail but the article and links don't mention that. Maybe the users are supposed to be psychic?
"...some idiot thought it would make sense to use some bloated Javascript from some other domain."
Hmm, swings and roundabouts, really. You load jQuery, Google Analytics, etc. from a well-known third party, as many sites do, and it gets cached under that domain, so it doesn't need to load again on each site you visit; overall, that should speed up your browsing experience.
As so many sites use jQuery, for instance, hosting it on each individual web server means your cache ends up holding hundreds of copies of exactly the same file, each downloaded separately. Also, the standard per-host connection limit a browser may enforce (based on the HTTP standards) can block further requests from downloading additional content, whereas serving some content from a different domain allows it to be fetched in parallel, especially if it's set up to load in a non-blocking/asynchronous fashion.
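The non-blocking third-party loading described above can be sketched like this; the CDN URL is a placeholder, and `defer`/`async` are the standard script attributes for fetching in parallel without blocking the HTML parser:

```html
<!-- Shared CDN copy: may already be in the visitor's cache -->
<script src="https://cdn.example.com/jquery.min.js" defer></script>

<!-- async: fetch in parallel, run whenever it arrives (fine for analytics) -->
<script src="https://cdn.example.com/analytics.js" async></script>
```

`defer` downloads in parallel but runs scripts in document order after parsing finishes; `async` runs each script as soon as it lands, so it only suits scripts with no dependencies.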
However, the site owner is then putting their visitors at the mercy of a third party, with the risk of malware injection, DNS timeouts, third-party outages, and so on. But on average you'd expect a large third-party CDN to deliver the scripts faster than your own site can, and it probably employs better security engineers than you do.
I don't see DNS queries as the real problem. And I've given up worrying about JS libraries: hopefully Houdini will allow things like jQuery to slim down over time, but the important thing is people letting the browser decide how to do things and loading as much JS as possible after the onload event.
HTTP/2 should bring significant improvements, but as long as people insist on using multi-megabyte images for thumbnail previews, websites will continue to get slower.
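For the thumbnail problem specifically, the usual fix is to serve resized variants and let the browser pick one, rather than shipping the multi-megabyte original and scaling it down in CSS. A minimal sketch (filenames hypothetical), using the standard `srcset`/`sizes` attributes:

```html
<!-- The browser downloads only the variant that fits the layout slot -->
<img src="thumb-200.jpg"
     srcset="thumb-200.jpg 200w, thumb-400.jpg 400w, thumb-800.jpg 800w"
     sizes="200px"
     alt="Thumbnail preview of the article image">
```

Note the `alt` text too, for the reasons given further up the thread.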
So I'm guessing that you'd normally get just a bunch of local, mass-appeal type websites, but then have to subscribe to the 'Sports Package' to see football scores (subscribe soon to Sports Plus to get Cricket Scores and save up to 10% on your next bill!).
It seems that Google's / Facebook's plan is to set a precedent for the lack of net neutrality so that, as high-speed connections are rolled out, no one will notice until it's far too late. Much like how the cable / DSL companies managed to get regional monopolies established in the US.