It seems like a rather long-winded way to say:
Security through Obscurity.
Startup Shape Security is re-appropriating a favourite tactic of malware writers in developing a technology to protect websites against automated hacking attacks. Trojan authors commonly obfuscate their code to frustrate reverse engineers at security firms. The former staffers from Google, VMWare and Mozilla (among others) …
Security through Obscurity.
Perhaps, but the article implies that the code is continuously shifting and so has a better chance of staying obscure. Even if you manage to successfully attack the site once, it will change the next time you visit the page, rendering the previous attack less effective without any extra action by the owners of the site. Clever if it works.
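The "continuously shifting" idea can be sketched roughly like this (my own illustration of the concept, not Shape's actual implementation): the server stamps form field names with a fresh random token on every response, so a scripted attack recorded against one page load targets names that no longer exist on the next.

```python
import re
import secrets

TEMPLATE = '<form><input name="username"><input name="password"></form>'

def polymorph(html: str) -> str:
    """Rewrite each input name with a fresh per-response random suffix."""
    token = secrets.token_hex(4)
    return re.sub(r'name="(\w+)"',
                  lambda m: f'name="{m.group(1)}_{token}"', html)

first = polymorph(TEMPLATE)   # e.g. name="username_1f2e3d4c"
second = polymorph(TEMPLATE)  # new suffix: names recorded from `first` are stale
```

A bot that memorised the field names from `first` would post to fields that simply aren't in `second`.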
Can't help but be reminded a little of the cyber warfare in GitS, in particular viruses that used the defenses of a system as an essential piece of their functionality. Let the arms race begin.
"Can't help but be reminded a little of the cyber warfare in GitS, in particular viruses that used the defenses of a system as an essential piece of their functionality."
That's how viruses in real life work.
I didn't paraphrase the footnote very well. The virus doesn't exploit a weakness in the AV to bypass it and infect the system; the virus is a fragment of a full program and actually lifts the AV code to complete itself.
"The virus doesn't exploit a weakness in the AV to bypass it and infect the system; the virus is a fragment of a full program and actually lifts the AV code to complete itself."
Yeah, that's par for the course in biology as far as viruses are concerned.
Not sure I'd like to debug those web applications. The logs would indeed be odd.
I think it would be like minified script debugging. Minified scripts are unintelligible, but for most technologies a dev can replace them with a non-minified version of the same code on the fly for debugging purposes.
I suspect the logs will be unchanged. I think this thing is basically a reverse web proxy that fiddles with the URLs. Perhaps a little like a NAT for http (shudder).
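If it really is a reverse proxy, the inbound half would translate the shifted names back to the originals before the request reaches the application, which is exactly why the backend and its logs could stay unchanged. A minimal NAT-style sketch (all names hypothetical, my own illustration):

```python
import secrets

class FieldNAT:
    """NAT-style translation table for form fields: random public names
    outbound, canonical names restored inbound, so the backend app
    (and its logs) never see the randomised names."""

    def __init__(self, real_names):
        self.out = {n: "f_" + secrets.token_hex(4) for n in real_names}
        self.back = {v: k for k, v in self.out.items()}

    def rewrite_form(self, html):
        """Outbound: swap canonical names for the random public ones."""
        for real, public in self.out.items():
            html = html.replace(f'name="{real}"', f'name="{public}"')
        return html

    def restore_post(self, posted):
        """Inbound: map the public names back to canonical ones."""
        return {self.back[k]: v for k, v in posted.items() if k in self.back}

nat = FieldNAT(["username", "password"])
page = nat.rewrite_form('<input name="username"><input name="password">')
data = nat.restore_post({nat.out["username"]: "alice"})
# data == {"username": "alice"} — the app only ever sees "username"
```

The translation state lives entirely in the proxy, which is what makes the "NAT for HTTP" comparison (shudder included) fairly apt.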
Mondays, Wednesdays and Fridays we've got SQL injections on offer and Tuesdays, Thursdays, and weekends we've got buffer overflows going cheap?
Either you program it properly or you don't.
Ok, clever clogs. Write me a thousand lines of bug free code.
It sounds like the software also does auto filtering of posted data to guard against SQL injections.
Probably other stuff too which I can't guess at :-)
"Ok, clever clogs. Write me a thousand lines of bug free code."
You don't need to be able to do better to tell when someone's cocked up. Try a valid counterpoint in future.
You are missing my point. In any large software project there will always be bugs.
Writing a large quantity of bug free code is nearly impossible, or at best, requires a huge amount of effort.
10 PRINT "Hello, world!"
20 PRINT "Hello, world!"
996 similar lines snipped.
9990 PRINT "Hello, world!"
10000 PRINT "Hello, world!"
There you go! Totally bug free!
(It doesn't do anything, but then that wasn't in the requirements)
Yes, writing bug free code is difficult and requires a large amount of effort but that's the only way to stop bugs being exploited. Randomly changing stuff doesn't get rid of the bugs, it just makes it harder to debug.
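For SQL injection in particular, the fix at source is parameterised queries rather than name-shuffling or after-the-fact input filtering. A standard sqlite3 sketch of the difference:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
conn.execute("INSERT INTO users VALUES ('alice')")

hostile = "alice' OR '1'='1"  # classic injection payload

# Parameterised: the payload is bound as data, never parsed as SQL,
# no matter what the form field was called when it arrived.
rows = conn.execute("SELECT * FROM users WHERE name = ?",
                    (hostile,)).fetchall()
# rows is empty — the injection matches nothing
```

Randomising the field name the payload arrives under changes nothing here; only the query construction does.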
Rather than scraping the site and parsing hypertext directly, automate a browser. Find out where the relevant UI elements get rendered in the page and from then on it's "that input element at that position, whatever it's called and however many zero-margin DIVs it's embedded in."
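That counter-move — ignore the names, target the structure — can be sketched with the standard-library HTML parser: grab "the second input inside the form" regardless of what it happens to be called on this visit.

```python
from html.parser import HTMLParser

class InputCollector(HTMLParser):
    """Collect <input> tags in document order; their names are irrelevant."""

    def __init__(self):
        super().__init__()
        self.inputs = []

    def handle_starttag(self, tag, attrs):
        if tag == "input":
            self.inputs.append(dict(attrs))

# Randomised names, as a shifting defence might emit them:
page = '<form><input name="u_8f1c2a"><input name="p_9d3e4b"></form>'
p = InputCollector()
p.feed(page)
password_field = p.inputs[1]  # "the second input", whatever it's named
```

A headless browser doing the same by rendered position would defeat name-shuffling alone, which is presumably why the real product would have to randomise structure and layout too.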
And yay, yet more patents pending on software. Guess this'll be kicked into the long grass for the next 25 years, then.
This sounds like a neat trick to make malware writers' lives a lot harder... It won't be invincible and it isn't a substitute for well-written code, but it could dramatically increase the amount of effort malware writers have to expend, which would be a good thing. It could help browsers to identify replay attacks as well.
On the downside it's going to break caching of web pages which could trigger an upswing in traffic. But on the upside it'll make traffic interception more interesting and hopefully a bit more expensive. ;)
Basically just means giving your form elements different names for every visitor. Have been doing this for years.
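A minimal version of "different names for every visitor" (my sketch, assuming a per-session secret): derive each field name with an HMAC of the canonical name keyed per visitor, so names are stable within one session but opaque and different across visitors.

```python
import hashlib
import hmac
import secrets

def field_name(session_key: bytes, canonical: str) -> str:
    """Per-visitor field name: stable within a session, opaque across them."""
    tag = hmac.new(session_key, canonical.encode(), hashlib.sha256)
    return "f_" + tag.hexdigest()[:10]

alice_key = secrets.token_bytes(16)
bob_key = secrets.token_bytes(16)

# Same canonical field, different visitors, different public names:
a = field_name(alice_key, "password")
b = field_name(bob_key, "password")
# The server recomputes `a` from Alice's session key to decode her POST,
# so no per-visitor lookup table needs storing.
```

Being deterministic per session is what makes round-tripping the POST cheap: nothing but the session key needs to be remembered server-side.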
I don't need any more crap in the network racks when I already have the BGP routers, forward firewalls, load balancers, anti-malware engine, IDS/IPS system, web cache appliance, vpn gateways, rear-facing firewalls, packet shapers...
Typical Web 2.0 idiot programmer thinking: "I have no time to check my code for security bugs, I'm too busy inventing the next InstaSnapLinkedFaceGram+. Let's just make something to cover this up and make it the responsibility of the Dev/Ops team!"
..Would surely be sites offering products they don't want scraped for comparison sites etc?
As someone who does automated testing I was a bit scared till I saw the underlying code doesn't have to change, so it's still nice and static in dev...
Biting the hand that feeds IT © 1998–2018