That digital (fsck cyber word slobbering) Pearl Harbor is coming any day now. Forward bases have probably been set up for years.
Two first-gen flaws carried over to HTTP/2, warn security bods
Security researchers have unearthed four high-profile vulnerabilities in HTTP/2, a new version of the protocol. HTTP/2 introduces new mechanisms that effectively increase the attack surface of business-critical web infrastructure, according to a study by researchers at data centre security vendor Imperva, released at the …
COMMENTS
Wednesday 3rd August 2016 19:01 GMT Dadmin
You are correct! I sat across from the Windows admins and the security folks at a hosting company I used to haunt (me am was Solaris/Linux/Unix admin, duh), and witnessed the release, detection, and mitigation of this so-called threat in real time. All told, it took about half an hour before our top security guy, okay he was the only guy, basically had it nailed down and we were back to our projects (aka porn). Meh. Code Red was a bump on the ass. The new threats are far more cancerous.
https://en.wikipedia.org/wiki/Code_Red_(computer_worm)
Thursday 4th August 2016 04:15 GMT Kevin McMurtrie
Slow Read
It doesn't surprise me that HTTP servers don't have Slow Read protection because it doesn't work well at the server level. A single server can detect that it has too many connections from a single client but it can't see the big picture. It could be that one client has thousands of connections open but each server behind a load balancer sees what looks like a legitimate set of requests from a tolerably slow connection.
Ten years ago, the most difficult time for a server was late night when everyone finished dinner and dialed in with a POTS modem. I/O and CPU dropped to nothing but system memory maxed out maintaining all those HTTP sessions and socket buffers.
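The "big picture" point above is the crux: per-backend connection limits miss an attacker who spreads slow connections across many servers. A minimal sketch of the idea, with hypothetical thresholds and a made-up `flag_slow_read_clients` helper (nothing here is from any real load balancer), showing how counts that look benign on each backend become obvious once summed at the balancer:

```python
from collections import Counter

# Hypothetical thresholds for illustration only.
PER_SERVER_LIMIT = 50   # what a single backend would tolerate from one client
GLOBAL_LIMIT = 200      # abuse level only visible in the aggregate view

def flag_slow_read_clients(backend_reports):
    """Sum per-client open-connection counts reported by each backend
    and flag clients whose aggregate exceeds the global limit.

    backend_reports: list of dicts mapping client IP -> open connections.
    """
    totals = Counter()
    for report in backend_reports:
        totals.update(report)
    return {ip for ip, count in totals.items() if count > GLOBAL_LIMIT}

# One client holds 45 slow connections on each of 5 backends: under the
# per-server limit everywhere, but 225 in aggregate.
reports = [{"203.0.113.7": 45, "198.51.100.2": 3} for _ in range(5)]
print(flag_slow_read_clients(reports))  # flags only 203.0.113.7
```

The same reasoning is why Slow Read defence tends to live at the load balancer or a shared connection-tracking layer rather than in the HTTP server itself.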
Thursday 4th August 2016 13:05 GMT Christian Berger
Well isn't that the whole point about HTTP/2?
I mean, it's not faster on decently designed websites over real-world connections or anything. The whole point of HTTP/2 is to greatly increase the complexity so that the number of browser and web server (library) vendors shrinks rapidly, which creates more bugs and makes each bug that is found applicable to more machines. After all, if you only have 2 implementations, any given bug is likely to be present on a huge number of machines.
It's time to admit that HTTP/2 is a bad idea.