Re: Can't win...
> Or does that sound too hard?
No, but if that solution didn't bring its own problems, it'd be a common design.
> Use a local proxy to cache all your remotely collected scripts. Have that proxy run a comparison check against the last known good version for all external scripts.
If you proxy locally, you have to carry all of that traffic yourself. Troy Hunt's blog post about this story mentions that for his site, today alone, that would be 500GB of minified script data just for jQuery if he didn't offload it to the Cloudflare CDN. That's costly. You can make all sorts of arguments about how people shouldn't use all these libraries, that we should jump straight into Notepad to develop our static pages by hand, but that rather misses the point.
Another downside of your local-cache model is that performance will be poor for visitors who aren't geographically close to your server. Most of those problems disappear if you're delivering off googakaflare. Even if you put your own site on a CDN, it's still download overhead for your first-time visitors.
You haven't specified how you differentiate a new good version vs. a new not-so-good version. Your technique tells you when a script changes, but so does Subresource Integrity (SRI), and you can set that up in 30 seconds.
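For reference, generating the SRI hash really is that quick. A sketch, assuming you have openssl installed (the filename here is a stand-in, not a real library file):

```shell
# Create a stand-in for the script you want to pin (in practice, download
# the exact file your page references from the CDN).
printf 'console.log("hello");\n' > example.min.js

# SRI hash: SHA-384 digest of the file, base64-encoded on one line.
HASH=$(openssl dgst -sha384 -binary example.min.js | openssl base64 -A)
echo "integrity=\"sha384-$HASH\""
```

You then paste that value into the script tag, along the lines of `<script src="https://cdn.example.com/example.min.js" integrity="sha384-..." crossorigin="anonymous"></script>`, and the browser refuses to run the file if its hash no longer matches.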
The problem with local caches or SRI is that they solve this problem but block the solution to the opposite problem. Imagine that instead of being pwned, this library vendor was privately reported an XSS bug that exposed the private data of your customers (and those of every other site using the same library). Without SRI, with everyone referencing a common version, that XSS flaw could be fixed immediately across millions of websites. With SRI and private caches, you can't do patch management that way. That is the crux of why it is a hard problem.
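To make the tradeoff concrete, a small sketch (with hypothetical files standing in for two versions of a library): an SRI pin matches exactly one byte sequence, so a CDN-side security patch changes the hash and every page with the old pinned `integrity` value stops loading the script rather than silently picking up the fix.

```shell
# Hypothetical vulnerable and patched versions of the same library.
printf 'lib v1 (vulnerable)\n' > lib-v1.js
printf 'lib v1 (patched)\n'    > lib-v2.js

# The SRI hashes differ, so a tag pinned to v1's hash rejects the patched file.
H1=$(openssl dgst -sha384 -binary lib-v1.js | openssl base64 -A)
H2=$(openssl dgst -sha384 -binary lib-v2.js | openssl base64 -A)
[ "$H1" != "$H2" ] && echo "hash changed: pinned pages will refuse the patched file"
```

That refusal is exactly the behaviour you want against a malicious swap, and exactly what you don't want for an urgent fleet-wide fix.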