Forget a wave of Web 2.0 threats taking down your software, stealing your data or exposing users - the real danger is posed by some existing attack techniques. And it's IT charlatans peddling overnight AJAX solutions that'll leave you vulnerable. Two security experts from Microsoft and Hewlett Packard have warned against " …
Surely it's common sense never to trust input from users, regardless of whether it's AJAX or not?
In the same way, you should never blindly trust programming tips or examples from strangers, in books or blogs. Unless you understand exactly what the code is doing (another form of "input" validation), it should not be implemented.
Anyone who needs the above pointing out should never have been hired to code in the first place. Actually, they need a good slapping if they go anywhere near code.
first rules of programming ...
1) Always check your inputs for validity (bounds checking etc)
2) Always check subroutine return or error codes
3) Test that rules 1 and 2 have been implemented
These were valid when I learned them back in 1982 and have not changed since. Whilst compilers or runtime engines can do some of this for you automatically, it never hurts to do your own checks.
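The first two rules might be sketched in Python like this (the `parse_quantity` and `save_record` helpers and the 1-to-100 bound are invented for the example, not from any particular codebase):

```python
# Illustrative sketch of rules 1 and 2; parse_quantity, save_record
# and the 1-to-100 bound are hypothetical.

def parse_quantity(raw: str) -> int:
    """Rule 1: validate input and enforce bounds before using it."""
    value = int(raw)                       # raises ValueError on garbage
    if not 1 <= value <= 100:              # bounds check
        raise ValueError(f"quantity out of range: {value}")
    return value

def save_record(record: dict) -> int:
    """Hypothetical subroutine with a C-style return code: 0 = success."""
    return 0 if record else 1

def handle_order(raw_qty: str) -> None:
    qty = parse_quantity(raw_qty)          # rule 1: checked input
    rc = save_record({"qty": qty})
    if rc != 0:                            # rule 2: check the return code
        raise RuntimeError(f"save_record failed with code {rc}")
```

Rule 3 is then just a unit test exercising both the happy path and the failure paths.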
On the client you can check inputs and provide fast feedback to users but that should never obviate checks made at the server as you can NEVER implicitly trust ANY client!
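A minimal sketch of that point, assuming a hypothetical username field: whatever JavaScript runs in the browser, the server repeats the check, because the request may have come from curl, a proxy or a tampered page rather than your form:

```python
import re

# Hypothetical username rule; the pattern is illustrative only.
USERNAME_RE = re.compile(r"[a-z0-9_]{3,20}")

def server_side_validate(username: str) -> str:
    """Run this on the server even when client-side JavaScript already
    checked the field: the client check proves nothing, since the
    client itself can never be trusted."""
    if not USERNAME_RE.fullmatch(username):
        raise ValueError("invalid username")
    return username
```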
I don't think programmers are now any better or worse at creating secure code ... each generation still needs time to learn - best if that is done in a classroom and not the workplace, though! ;-}
Nothing like creating a market...
What all these books are saying is that if you write/use insecure code, then you will be insecure.
Any 'web' programmer who has any professional integrity will know about secure coding practices, and how to avoid security holes.
Any shill paid by a software house can say: Look, I wrote a crap application and then hacked it!!!1! Buy our special extra-protective tin foil helmets and this won't happen to you.
The only solution to this problem is market regulation, but I don't see that happening any time soon.
...don't always find the errors. Shouldn't it be "They are taking your back-end database tiers and moving them to the perimeter", not "parameter". Seems to change the meaning somewhat.
Interesting article though.
'Billy Hoffman, manager for HP Software's security labs, added: "Companies will say: 'We can Web 2.0ify your existing applications in 15 minutes - we've got a wrapper'. These people are charlatans, and you should punch them in the face. They are taking your back-end database tiers and moving them to the parameter."'
So is HP actually advocating criminal violence, or is Hoffman providing personal advice here? Oh, and ditto on 'parameter s/b perimeter'. One doubts that a database of any significance could be efficiently stored in one parameter.
The moral of this story ...
If the bowels of your sales application don't check prices against a safe, authoritative source, and the code that hands over the goods makes no attempt to check whether you've actually been paid, you probably work for Microsoft (or HP apparently).
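The safe version of that checkout is easy to sketch (the catalogue, SKU and prices below are made up): the server looks the price up in its own authoritative source and confirms payment before handing anything over:

```python
# Invented catalogue and SKU; the point is that price and payment are
# checked against the server's own data, never the client's claims.
CATALOGUE = {"sku-123": 1999}  # price in pence

def checkout(sku: str, client_price_pence: int, paid_pence: int) -> bool:
    real_price = CATALOGUE.get(sku)
    if real_price is None:
        raise KeyError(f"unknown SKU: {sku}")
    if client_price_pence != real_price:
        # A tampered form posted a different price: refuse outright.
        raise ValueError("client-supplied price does not match catalogue")
    # Hand over the goods only if payment actually covers the price.
    return paid_pence >= real_price
```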
Why pick on AJAX? If you're sufficiently stupid, you can write insecure software using any technology you like.
The problem is that people are not always aware of what is user input. Most people new to web programming fail to recognise headers and cookies as user input. Some fail to recognise GET / POST data as user input if it's been 'hard' coded into the page.
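A sketch of treating a cookie as just another untrusted field (the `lang` cookie and the language whitelist are invented for illustration):

```python
ALLOWED_LANGS = {"en", "fr", "de"}  # illustrative whitelist

def parse_cookies(header: str) -> dict:
    """Minimal Cookie-header parsing; every value is attacker-controlled."""
    out = {}
    for part in header.split(";"):
        if "=" in part:
            name, _, value = part.strip().partition("=")
            out[name] = value
    return out

def language_from_request(headers: dict) -> str:
    """Treat the 'lang' cookie exactly like a form field: whitelist it."""
    lang = parse_cookies(headers.get("Cookie", "")).get("lang", "en")
    return lang if lang in ALLOWED_LANGS else "en"
```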
reinventing the wheel with spokes on the outside
who would use AJAX anyway?
I can't use gmail from behind my company firewall (attaching files fails), but I can from home. What sort of mission-critical technology is that? And this from a company with billions! Think about UK PLC 'stumbling through' on small budgets and human staff.
As for using any AJAX development frameworks .......
Actually, that company with billions can get it right. If you are having problems with a non-compliant browser failing, you can click on the link at the bottom of any gmail page that says: 'basic html'. This is also true if you have a proxy that can't handle the AJAX requests.
If you are having a problem with uploads failing, then this is not Google's fault, but most likely a broken network setup that only manifests itself with large requests, such as trying to attach a file.
I wouldn't say that it's always those new to web scripting. Some very big projects supposedly written by those with experience have suffered from seriously bad coding practice. phpBB and others storing serialized data in cookies for instance.
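One common fix for that mistake is to sign anything you must store client-side, so tampering is detectable on the way back in (a generic HMAC sketch, not phpBB's actual mechanism; the secret is illustrative and would live server-side only):

```python
import base64
import hashlib
import hmac
import json

SECRET = b"server-side-secret"  # illustrative; never ships to the client

def sign_cookie(data: dict) -> str:
    """Store JSON client-side, but append an HMAC so tampering is detectable."""
    payload = base64.urlsafe_b64encode(json.dumps(data).encode()).decode()
    mac = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return f"{payload}.{mac}"

def verify_cookie(cookie: str) -> dict:
    payload, _, mac = cookie.rpartition(".")
    expected = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(mac, expected):   # constant-time comparison
        raise ValueError("tampered cookie")
    return json.loads(base64.urlsafe_b64decode(payload))
```

Even then the decoded data is plain JSON, not unserialized objects, which avoids the whole class of deserialization attacks.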
OTOH, there are also some new web scripters who never trust anything coming into their scripts and would never trust external storage in that way.
I'd say it's more good vs bad coders. A good coder will have at least some knowledge of the entire environment their code runs in. On the net, that means everything is "tainted". Everything from POST/GET right down to the remote IP address can be altered.
If you build from the ground up never trusting any input, you eliminate a lot of attack vectors just by escaping everything. SQL and XSS injections become extremely difficult to pull off. If you take it a step further and never trust the user to handle their own security, you build in minimum password strength requirements and brute-force protections, which make it a lot harder to break into an account.
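Both defences are standard-library territory in, say, Python: placeholders keep user input out of the SQL grammar entirely, and escaping on output blocks reflected XSS (the `users` table and `greet` helper are illustrative):

```python
import sqlite3
from html import escape

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
conn.execute("INSERT INTO users VALUES (?)", ("alice",))

def find_user(name: str) -> list:
    # The ? placeholder keeps user input out of the SQL grammar, so a
    # classic injection string is just a literal that matches nothing.
    return conn.execute(
        "SELECT name FROM users WHERE name = ?", (name,)
    ).fetchall()

def greet(name: str) -> str:
    # Escape on output so reflected input cannot become markup (XSS).
    return f"<p>Hello, {escape(name)}</p>"
```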
The only way to safely approach web coding is with a paranoid outlook, and anyone writing web code who fails to notice the amount of hacks being done against big (supposedly well coded) sites is either ignorant or dumb. Neither of those qualities are what I'd say make a good coder.
Not to say that mistakes won't be made in any code, but there is a difference between forgetting to escape one input and deliberately choosing to trust it.
Ever done a large system?
Protecting against attackers is not much different from protecting against other coders or other applications in the system. It should teach everyone to design the part they work on against all kinds of misuse - or do you want to spend your time tracing and debugging your own part when someone else makes a mistake?
A word of advice: don't ever trust the spec or the defaults - sooner or later it will bite you. Unfortunately, today the push to deliver is so high that even the best sometimes take shortcuts. I'm amazed that companies do that, but maybe the short-term benefits look too attractive: next quarter up, huge problems a year from now - but who cares?