Over half of all apps have security holes

More than half of all software applications failed to meet an acceptable level of security, according to a study based on real-world code audits by application security firm Veracode. Around 57 per cent of applications failed to pass muster when first submitted to Veracode’s cloud-based testing service. A similar 56 per cent …

COMMENTS

This topic is closed for new posts.
  1. Anonymous Coward
    Anonymous Coward

    Hardly surprising

    People who install "apps" willy-nilly deserve to be hit. The words "testing" and "security" jump to mind.

  2. Chris Miller

    And in-house developed web apps

    are often even worse from a security perspective. I think this is because what most organisations want from their web apps is that they must:

    a) perform - web pages lose half their readership for every second they take to load (or some other statistic I've just made up);

    b) look pretty - preferably with lots of insecure moving pictures, but see (a) above;

    c) be finished today not tomorrow - we're operating in web time now, people!

    If security even figures on the list, it'll be pretty low down - and anyway there's no simple way to evaluate security, and the tools developers need to build it into the testing cycle are expensive.

    Until some or all of these things change, we won't be seeing much improvement in web security.

    1. Galidron
      Stop

      Interesting

      I must have read this part of the article incorrectly.

      "Applications developed by third parties had lower security quality than those developed in-house, failing to reach acceptable security standards 81 per cent of the time, according to Veracode."

  3. Neal 5

    @ John, the author of the story

    Really, I must confess my own naivety: I'm surprised that you would actually consider this to be news, or even newsworthy.

    Perhaps we do ALL live in the dark ages after all.

  4. Nipsirc
    FAIL

    Or to put it another way...

    ...Veracode's code testing service only found security holes in half the software submitted to it. Nah, that's not such a good stat, let's leave that one out of the thinly veiled advertisement.

  5. Doug
    Linux

    software applications security

    What OS do the vast majority of these bad apps run on, and shouldn't security be built into the operating system?

    1. PsychicMonkey
      Pint

      I would go the other way

      and say that security should be built into every application. Do you trust the security of your application to the OS maker? I don't...

    2. Anonymous Coward
      Troll

      Yawn...

      ...yet another Linux troll with no understanding of what they're on about.

      Let's rely on the OS, since Linux is 100% secure, correct?

      http://www.linuxsecurity.com/content/view/153338/169/

      oops

      In fact, if there were no problems, I guess this page would be blank.

      http://www.linuxsecurity.com/content/section/3/170/

      But it's not, is it?

  6. Anonymous Coward
    Thumb Down

    meh

    "Developers continue to focus on functionality, quality, and speed of release. Security slows down The process and often gets overlooked."

    Fallacy. MANAGEMENT continue to focus on this; because they don't really know anything but how to "manage" people, they push security off the agenda, seeing it as unneeded. They set the emphasis for their dev teams.

    Lack of security comes down to one thing: the cost of doing it properly. Otherwise we'd have grown a culture of doing it right by now, instead of sending all our requirements offshore, where all the previous lessons of how it should be done weren't learned. Developers in my experience are always happy to learn new stuff, and once it goes through QA and security testing, they're happy to learn from their mistakes and rectify them before it gets to v1. That implies investment in specialist resourcing to test, and in QA cycles - both of which get thrown out as too costly by beancounters who don't understand the value.

    The trail always ends up with someone at the company that wrote it skipping adequate testing and QA to save money. Then it ends up as a big spaghetti mess, while a load of accountants keep trying to find different ways of asking whether having a remote root exploit in firmware is really an issue, until they find someone who gives them the answer they want (that it'll be OK, really). Quite often they've bought in a codebase from offshore to save money and don't even HAVE the expertise, or a way forward, to seek a fix.

    Please can the tone of this article put the blame where it really lies? Developers deliver what management demand of them or they're fired, and any ultimate failure of the system must lie completely with the managers. IT done on the cheap...

    1. Tom 13

      @AC 23-Sep-2010 11:52 GMT

      They could put the blame where it belongs, but that wouldn't help Veracode sell their software and services now, would it?

    2. Galidron
      Thumb Down

      First Draft

      The article implied their numbers are based on the first time a developer submits their code through the scan. It seems to me a bit ridiculous to expect code to be 100% free of security bugs before it has been analyzed. It is easy to miss something; that's the whole point of using a tool like this.

  7. Anonymous Coward
    Grenade

    Oranges are not the only fruit

    "Cross-site scripting was the most common vulnerability..."

    Errr... this is very much an HTML/web issue only. There is more to software than "the web", despite what the marketing people at MS etc. may have us believe.
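
    For anyone who hasn't bumped into it, here's a minimal sketch of the kind of reflected XSS hole being counted, and the usual fix - a hypothetical Express/TypeScript handler made up for illustration, not code from the report:

        // Hypothetical example: reflected cross-site scripting in a web handler.
        import express from "express";

        const app = express();

        // Vulnerable: user input is echoed straight into the HTML response, so
        // /greet?name=<script>...</script> runs in the victim's browser.
        app.get("/greet", (req, res) => {
          const name = String(req.query.name ?? "");
          res.send(`<h1>Hello, ${name}</h1>`);
        });

        // Safer: HTML-encode anything that came from the request before echoing it.
        const escapeHtml = (s: string) =>
          s.replace(/&/g, "&amp;")
           .replace(/</g, "&lt;")
           .replace(/>/g, "&gt;")
           .replace(/"/g, "&quot;")
           .replace(/'/g, "&#39;");

        app.get("/greet-safe", (req, res) => {
          res.send(`<h1>Hello, ${escapeHtml(String(req.query.name ?? ""))}</h1>`);
        });

        app.listen(3000);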

    On a general note, why is anyone surprised to hear that 50%+ of all applications have security holes in them? The general standard of software that I've seen over the last 25+ years suggests to me that it should be closer to 90%. Basically, even leaving security holes aside for a moment, a LOT of software is just plain shite! Go and pull off some random bit of code from the wobbly web and take a look at it (maybe for your fave app of the moment). There's a very good chance it will be badly written, the documentation is usually non-existent or misleading, it will be convoluted, bloated, poorly designed (I use the word loosely) and generally, ...well ...shite. I wouldn't trust most of the software out there to run my toaster, never mind a bank.

  8. Is it me?

    No surprises here

    The cost of actually testing applications to any reasonable level usually exceeds the budgets available, so protection is applied at the server or network level - however, in a cloud that's a bit more difficult. A long time ago, when OO was just becoming mainstream, Powersoft (now Sybase) used to state that all object tests must cover every possible trigger on the object, regardless of whether they are coded or not, because you can never be sure what will happen if you fire a trigger, or that a coder hasn't put a back door on CTRL/Alt/Shift/click.
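
    To illustrate the point, a made-up TypeScript/DOM snippet (nothing Powersoft ever shipped): a handler like this only does anything on Ctrl+Alt+Shift+click, so a functional test that never fires that exact combination sails straight past it.

        // Hypothetical hidden "trigger": only fires on Ctrl+Alt+Shift+click,
        // so ordinary click-through testing never exercises this code path.
        document.getElementById("logo")?.addEventListener("click", (e) => {
          if (e.ctrlKey && e.altKey && e.shiftKey) {
            // Back door: jumps straight to an (invented) admin page, skipping login.
            window.location.href = "/admin?debug=1";
          }
        });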

    I'll bet most of the holes are a whole lot simpler than that. And are scripting languages designed to be bomb-proof? Probably not.

  9. Psiinon

    Zed Attack Proxy

    As always there are lies, damn lies and statistics.

    But various similar studies have found that there are vulnerabilities in a significant proportion of web applications.

    Fixing the root causes will not be easy; it will involve a combination of better developer training, static source code analysis tools, automated scanners and pen testing.

    One thing that I've found is that most developers consider pen testing to be a 'black art', but I believe that if you don't know how people will attack your app then you won't really understand how to defend it.

    So here's a bit of self-promotion :)

    I've released the Zed Attack Proxy (ZAP): http://code.google.com/p/zaproxy/

    It's a pen test tool explicitly aimed at developers (and functional testers).

    It's free, open source and cross-platform.

    And involvement in the development of ZAP is actively encouraged!

    Psiinon

  10. Anonymous Coward
    Anonymous Coward

    testing

    Before the developer vs beancounter debate starts, can I suggest that until testing is more common than it is now, and security testing is part of it, relying on developers to use their own good practices will just let through those who don't have good practices (yet).

  11. Charles 9

    Security, meet the budget.

    I frankly don't see how it is at all possible to build a properly secure program when practically all projects have a budget and/or a deadline - especially when pennies are being pinched and projects are due *yesterday*. It's the classic dilemma of being told to do it right and do it fast at the same time.

    Reminds me of a couple of separate incidents in the early '80s, both involving video games produced by a then-popular company called Atari. The games had SHORT deadlines (they MUST be out in time for Christmas shopping), so the developers did their darnedest. Unfortunately, no one really liked the games, and one of them still carries a rather sour reputation even today.
