back to article Software sucks these days - and just maybe it's all YOUR fault

Every now and then, I write a blog article that could probably get me sued, sacked or both; this started off as one of those, and has been heavily edited by myself to avoid naming names. Software quality sucks. The "release early, release often" model appears to have permeated into every level of the IT stack; from the buggy …

COMMENTS

This topic is closed for new posts.
  1. Steve Davies 3 Silver badge
    Paris Hilton

    Say no to Beta grade stuff?

    Well that excludes just about everything that comes out of Redmond, WA.

    The latest Windows patches are a prime example. 125MB... most of this is IE9 and .NET v4. Buggy software these guys release, without a doubt.

    Paris because she just loves watching herself on 'Betamax'.

  2. Trevor_Pott Gold badge

    CSC

    Corporate Stable Code. Outside of Redhat - and maybe IBM - we're going to get that kind of commitment where?

  3. jake Silver badge

    First of all ...

    ... There is no such thing as "software". So-called "software" is the current state of the hardware.

    My hardware runs better today than it did thirty years ago.

    Your "slow software movement" is more properly called "kit that works".

    I suspect your real issue is with marketards pushing kit that is broken by design ...

    1. Anonymous Coward
      Anonymous Coward

      Re: First of all ...

      I think you need to upgrade from the valves and gears kit you're currently running

      1. Anonymous Coward
        Anonymous Coward

        @AC 08:06

        Oh, I don't know. Steam punk is enjoying quite a resurgence lately.

    2. Anonymous Coward
      Anonymous Coward

      Re: First of all ...

      There is no hardware. Once you know this, then you'll see it is not the hardware that runs, but yourself.

    3. Anonymous Coward
      Anonymous Coward

      Re: First of all ...

      Nice try. Since you don't understand what software means to IT people, please stop posting. You're not likely to post anything of value. As indicated by past posts.

      1. jake Silver badge

        I find it amusing ... (was: Re: First of all ... )

        ... that none of the AC[1] replies to mine seem to have a clue as to my meaning. I also find it sad. Seriously, kiddies, there is more under the hood ("bonnet", to you Brits) than "apps".

        [1] I also find it amusing that all replies are AC ... :-)

    4. Anonymous Coward
      Anonymous Coward

      Re: First of all ...

      "My hardware runs better today than it did thirty years ago."

      Lucky you, mine needs Viagra.

      ...oops, wrong hardware

  4. Anonymous Coward
    Anonymous Coward

    Totally

    I'm plenty disgusted by vendors who don't even release a changelog and upgrade your servers without telling you, breaking your business. Thanks wankers...

  5. TeeCee Gold badge
    Facepalm

    "...the amount of fundamentally broken code that has made it through to us is scary."

    There's an easy explanation for that.

    Back in the day, your code would have been written by someone who a) knew the language it was written in, b) knew the system it was targeting, c) had a basic understanding of the problems and processes it was meant to address and d) knew what was and was not good practice in the environment it was written to run in.

    These days it'll have been written by some clueless graduate in India who knows Java, has had a two day cross-training session and been given a spec relating to something he's never heard of.

    You think the bugs are bad? Try looking at the code......

    1. Anonymous Coward
      Anonymous Coward

      Re: "...the amount of fundamentally broken code that has made it through to us is scary."

      I had a job doing software reliability testing, but when I tried to apply the standards of quality and repeatability I had been drilled in when testing hardware, I got shouted at: it made the product look bad, since it got lower performance numbers than the previous version. When I re-ran the tests on the previous version using the same methodology, it showed the new version was much better, but apparently that was even worse, as it made the previous group of testers look incompetent.

      The sad thing is the code was actually good; it was just the testing methodology that was geared towards marketing rather than engineering.

      Anon....because the code really was good and I don't want to upset them

      1. Anonymous Coward
        Anonymous Coward

        Re: "...the amount of fundamentally broken code that has made it through to us is scary."

        Another problem is legacy. Code where I work has gone through various iterations: starting out with MINT code, then C built on top of that, then C++, now C#. We have tools built in Eiffel and VB, and we've started using XML.

        Each time a new language is added, our coding standards change and the old code is effectively abandoned, left in a state of limbo. Much of the code is untouchable, not just because it's the core of the program, but because the person who coded it used the strangest of coding methodologies, deliberately overcomplicating things so that they couldn't easily be replaced, only to then quit.

        The original code is now over 20 years old and it still holds up the back end. All we've done is put bubblegum and paperclips over the top to keep it held in shape. And because we're only the software arm of what is mostly a hardware company, management won't really accept the resource drain to redo this legacy software from scratch. So we're stuck patching up fixes for things we can't actually fix, because doing so would break several other things.

    2. Michael Wojcik Silver badge

      Re: "...the amount of fundamentally broken code that has made it through to us is scary."

      Spare us the historical fantasies. I've been a professional software developer since the late '80s, working in a variety of areas - mostly system software, with some application and some embedded work, on platforms ranging from PCs (and expansion cards for them, hence the embedded development) to mainframes. There have always been lousy programmers.

      Yes, software is a bigger industry now, and that's dragging in more unqualified staff. That's part and parcel of the other factors Glassborow identified: increased complexity, market forces that encourage frequent releases and discourage testing, the declining cost to manufacturers of providing fixes. But there was no golden age when all programmers were competent, and certainly not one when they were all sages who could produce code every time that was free of "fundamental" errors and fit for purpose.

  6. Anonymous Coward
    Facepalm

    Ah...

    So you've seen the Firefox update cycle then too? :D

    I'm sticking to a 6 month or yearly update on FF, devs be damned!

    1. Mike Flugennock

      Re: Ah...

      So you've seen the Firefox update cycle then too? :D

      Seriously, man... I'd just gotten settled in with Firefox 4 when suddenly, I turned my back for a minute, and they were up to version 7. And, what are they at now... version 14, something like that.

      That's just nuts, man. What the hell are they thinking?

  7. Paul 87
    Alert

    I'd offer an explanation that it's the exponential growth of computing power that has made developers sloppier in their approaches. Whereas once upon a time, they had to watch every resource used closely, and not following best practice led to programs that failed to run entirely, now we're in a situation where "good enough" code runs and error handling routines pick up the slack.

    Additionally, there's the commercial drive: companies have realised they can make money by constantly selling to new customers while caring little about retaining or placating the old ones. That means less financial pressure on the developers to fix anything other than the most critical bugs; instead they become focused on developing new features that look good in a sales demo.
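
    The "good enough code runs and error handling picks up the slack" pattern described above might look like this minimal Python sketch (the function name and behaviour are hypothetical, purely for illustration):

```python
def parse_quantity(text):
    # "Good enough" style: a blanket handler quietly papers over bad input
    # instead of surfacing the failure to the caller.
    try:
        return int(text)
    except Exception:
        return 0  # the error is swallowed; the program limps on

# The happy path works...
print(parse_quantity("42"))    # 42
# ...and garbage silently becomes a plausible-looking value.
print(parse_quantity("lots"))  # 0
```

    The fix costs almost nothing (raise, or validate and report), but as the comment notes, cheap cycles and forgiving runtimes mean nobody is forced to pay it.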

  8. Anonymous Coward
    Holmes

    Ask to see the design documentation ...

    For any major upgrade where you are spending non-trivial amounts of money, ask for the design documentation. In most hardware companies they can probably show you something. In most software companies they probably look shifty and mumble something about having Doxygen for the APIs.

    Most of this stuff is heavily stateful, probably has had threading bolted on as an afterthought and security tossed in at the last minute for good measure, and I can guarantee that 95% of it has never gone near any form of design review board at any point in its life. Writing code isn't hard; designing software which works unfortunately is. In most places the "powers that be" don't view design as valuable, and get twitchy when the code line count isn't going up fast enough.

    It's like a civil engineer turning up on day 1 and starting to drive piles for a skyscraper. "Hey mate, that looks a bit wonky", "Don't worry, we'll sort it out after release".

  9. The Alpha Klutz

    software sucks because the sales people aren't selling any software

    just their jumped up notion of brand identity and other such nonsense. any actual product that gets made is incidental.

    1. HamsterNet
      Thumb Down

      Re: software sucks because the sales people aren't selling any software

      As a software sales person, I can tell you we just sell whatever crap comes spewing out of the dev team. Whether it's utter tosh or amazing, we don't get a say; just a big "sell this amount of this or look for another job"...

      If it's good, we can sell it based on what it does and how it does it. If it's crap, then it's all about the branding and marketing.

      Which is the same in all markets. Ever wondered why LV bags look so cheap and plastic yet sell for hundreds?

      Because suckers always buy into brands...

  10. wowfood

    Are users to blame? We are a lot more accepting of poor quality code. We are used to patching everything from our PC to our consoles, cameras and TVs, especially those of us who work in IT and find it relatively easy to do so.

    Yes and no. The problem is, users don't know we're getting this beta grade stuff until we get it. I'll use videogames as a prime example here. I have previously bought games at launch which have been amazing, I couldn't find a single bug. Other games, I've bought them, they've had a fair few bugs, a day 0 patch and then a second one a few weeks later and it's all fixed. And further games still which come out buggy and stay buggy.

    So what choice do I have? Read reviews? According to most reviews, games are flawless. I've read a review where the guy gushed about how amazing game A was; when game A was released and I bought it, it was unplayable with the number of bugs, and I wasn't the only one who thought so (so many forum topics).

    So what if I wait? Will that fix things? But then I risk missing out on Game B, which is bug free from the start. And how will I know if bugs have been fixed yet or not?

    I kinda wish there were a "bugged games" website for new releases listing how buggy the games are.

    It's all well and good telling people to "say no to beta quality software", but when we don't know the software is beta quality until we've found that one review that wasn't paid for, or until we've paid for the game ourselves, it really is unavoidable.

  11. Steen Eugen Poulsen

    I have some thirty years of coding experience.

    Looking at my old code, it calculates 2+2 just fine, but it presents the output in binary. The program is totally useless in the modern world. We have learned a lot in the past 30 years and can do much better.

    Try comparing something like Zork to a modern adventure game: the difference is gigantic in the amount of data and code Skyrim or whatever deals with, compared to Zork, where you could write down every choice on a small piece of paper.

    Our current technology doesn't allow humans to deal with the complexity of modern software development in a way so they can make bug free code within a reasonable budget and time frame.

    A simple thing like doubling the number of developers on a project doesn't mean you get double the amount of code written. We have reached a human complexity bottleneck that no one has yet managed to break through.
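
    One rough way to see why doubling the developers doesn't double the output is Brooks's observation about pairwise communication paths; a quick Python sketch of the arithmetic (an illustration of the scaling, not a project-management formula):

```python
def communication_paths(n):
    # n developers have n*(n-1)/2 pairwise channels to keep in sync;
    # coordination overhead grows roughly with the square of team size.
    return n * (n - 1) // 2

print(communication_paths(5))   # 10
print(communication_paths(10))  # 45 -- double the team, 4.5x the chatter
```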

    1. John King 1
      IT Angle

      Alternatively

      Or we may just have a new generation of coders who slap their code on top of an existing framework, drag and drop their usual functions without a thought for efficiency, and have no processes in place to test code performance. I've often seen people rely on new hardware or a move to the latest programming language for their speed increases, while ignoring their bloated code.

      Step back a generation and every coder was extremely careful about every single byte. You may only have had 32K to play with which is the size of a big CSS or readme.txt file these days.

      1. AceRimmer

        Re: Alternatively

        But the problem is that if every coder was still careful about every single byte we'd not be able to progress. It would be akin to a house builder worrying about how to manufacture bricks, cure wood, forge steel and make glass.

        The fact is modern requirements are a lot more complex as expectations have been pushed up over the years.

      2. The Alpha Klutz

        Re: Alternatively

        CPU cycles are much cheaper than people.

        Most software is unremarkable: store a bunch of numbers, show them to user x when he logs on, send out an email. It's not some NASA shit they're working on. An engineer might write the code efficiently, but a businessman realises it doesn't matter: by the time you've finished writing it efficiently, CPUs are already 10 times faster again...

  12.
    Thumb Up

    Software's catching up

    Seems to me that this just means that software's finally catching up with the construction industry, where the norm is build it (ignoring any bits of the design that might be inconvenient, difficult, required), wait for the client to notice (and complain), then knock it down and do it nearly right. In the words of Flanders & Swann, "It all makes work for the working man to do". Or is that unduly cynical?

    1. dajames
      Joke

      Re: Software's catching up

      In the words of Flanders & Swann, "It all makes work for the working man to do". Or is that unduly cynical?

      Certainly unduly uncygnical!

      (An icon of a swan would be nice, for Swann)

  13. Anonymous Coward
    Anonymous Coward

    Funny

    Funny to witness this process again and again: there is the CFO being lulled into a false sense of cost-cutting nirvana by outsourcing and offshoring software development. How great, these guys cost only a fifth of what these money-grabbing developers here want! This ACCA-trained guy may be the man with the money, but of course has no clue how coding actually works. So a separate system needs to be set up to ensure that people halfway around the world cannot copy intellectual property that was painstakingly developed over years. The code which is then produced is so sloppy and infantile that senior developers need to review, correct and test it. Before they have the time to do the static source code analysis, the sales VP is huffing and puffing that he needs that code, and he needs it now. Does it do da trick? Good, publish it and ask questions later, dis' a manadshment decishion!

    So here we have it: outsourcing that generates more total overhead and cost, bloated business processes that are only 20% effective, code with known and unknown bugs, vulnerabilities and performance issues... the list goes on.

    Now, what are product life cycle strategies, long-term maintenance and release planning, enterprise architecture, granular charging structures...

    Anonymous for obvious reasons.

  14. Anonymous Coward
    Anonymous Coward

    I thought it was all "Unit Testing" these days?!

    I've been "shopping around" for a totally Free and Open Source alternative to .net, and every language I looked at, every book I've skimmed, every web framework or RIA setup has stressed the importance of Unit Testing, leaving me with the impression that my Cowboy Coding technique was woefully behind the times. Is it?

    In my world, the developer has a quick play to see if the code does what it's supposed to, the Support Team meddles with it to see if they're happy (they're the ones who field the support calls from irate users after all) - and when we're all satisfied it's usable, the changes are released.

    If bugs are found later, the users are used to it :) .... and it's usually a simple fix with a quick turnaround, so it's never seemed to be a big problem to either party. If a real show-stopper crops up, so long as we can replicate the problem we can usually get a fix to the client in half a day or less.

    Given the complexity of software, is that really so bad? In my field it's not life or death, or time critical on the whole, so the really rigorous coding alternative seems to call for so much extra time spent developing, time that could be used more productively adding more features. Paid-for features, billable and bringing in cash :)

    1. Anonymous Coward
      Anonymous Coward

      Re: I thought it was all "Unit Testing" these days?!

      Unit testing... meaning you have a test for your method, which passes on some inputs. Congratulations! Your method which can take a continuum of inputs passes for one of them!

      Unit testing is good. Designing your code to be testable by unit tests means it's more likely to be correct in the first place. But it can absolutely give a false sense of security. You still need proper integration testing.
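
      A minimal Python illustration of that false sense of security (hypothetical function, deliberately buggy):

```python
def absolute(x):
    # Deliberately buggy: forgets the negative branch entirely.
    return x

def test_absolute():
    # A "passing" unit test that probes one point of a continuum of inputs.
    assert absolute(3) == 3

test_absolute()      # green tick in CI...
print(absolute(-3))  # ...yet -3 comes back as -3, not 3
```

      The test suite is green, the method is wrong for half its domain; which is exactly why unit tests complement, rather than replace, integration testing.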

  15. Anonymous Coward
    Anonymous Coward

    Buildings

    "It's like complaining a civil engineer doesn't turn up on day 1 and starting to drive piles for a skyscraper. "Hey mate, that looks a bit wonky", "Don;t work we'll sort it out after release"."

    A TV programme a few years ago followed the building of a skyscraper. It was interesting how many times there were crises as the building progressed - and quick workround solutions had to be found.

    Such problems lie in several areas. New materials, new techniques, new people, and new environments. Even when someone has done something successfully many times - there are always constraints waiting to be unexpectedly breached. Sometimes there are two existing errors cancelling each other out.

    1. AceRimmer

      Re: Buildings

      "Sometimes there are two existing errors cancelling each other out."

      An Error and an Anti-Error

  16. Vladimir Plouzhnikov

    Possibility of endless updates

    The Internets is wot does it.

    The developers got lazy because it's so easy to say "Internet connection required". Then they can push half-baked code out of the door. And if there are bugs? Oh, we'll just fix them later in updates! Thanks to the internet connection they can push updates every day if they feel so inclined.

    Using the obligatory car analogy: you get your car now, but the wheels will be sent by mail later. Oh, and about that oil filter... it's not ready yet, just drive around without it for a while...

    The users are to blame as well, of course, because they meekly accept this ludicrous model.

  17. Paul Anderson
    FAIL

    Dishonest and Just Plain Wrong

    "It does leave me wondering, has software quality gone downhill? " Definitely! I've worked in the infrastructure and support biz for 20 odd years and see a steep decline in software quality control. It's so bad that with some packages, even from big vendors for infrastructure, features and even the install script itself can be Dead On Arrival. It never ceases to amaze me the garbage developers hoist on customers. It's dishonest and just plain wrong. The car analogy is an old one, but still absolutely valid. Would you tolerate a Ford dealer telling you that you have to wait for the MK2 upgrade just to get the stereo to work ? We (customers, sysadmins, buyers, support) are far too tolerant.

    1. BlueGreen

      Re: Dishonest and Just Plain Wrong

      > It never ceases to amaze me the garbage developers foist on customers

      No, blame the customers for accepting shite at the cheapest possible price every time. If Customer didn't willingly eat garbage then Manager would not willingly sign it off because there'd be Consequences.

      Maybe I'm being cynical but maybe people are dumb consumers and maybe I'm not being cynical.

    2. Anonymous Coward
      Anonymous Coward

      Re: Dishonest and Just Plain Wrong

      Please don't blame us developers... some of us are perfectly well aware that the software is unfit for production, and even risk our jobs to say so before it's shipped anyway.

  18. Anonymous Coward
    Anonymous Coward

    It's Agile's fault

    It's all about the Agile process. There is no design, or time for it, so you build as you go, making the design decisions you need at the time you need them. With requirements and APIs being constantly negotiated, it's no wonder that what used to work is now broken and ships as such.

    1. Ben Norris

      Re: It's Agile's fault

      That is Agile implemented badly.

  19. PM.
    Thumb Up

    I recall an HP printer that needed a firmware upgrade straight out of the box, because, well, it didn't print.

  20. Gordan
    Thumb Up

    Scary?

    "The "release early, release often" model appears to have permeated into every level of the IT stack; from the buggy applications to the foundational infrastructure, it appears that it is acceptable to foist beta quality code on your customers as a stable release."

    I couldn't agree more, and my experience completely matches yours. The mantra of "release early, release often" is what people say when they actually mean "release early, release broken" but don't dare say it for fear of causing controversy (albeit controversy that badly needs to be caused).

    "Just to make things nice and frightening I "like" to, every now and then, search vendor patch and bug databases for terms such as "data corruption", "data loss" and other such cheery things."

    Just sign up to a few file system development mailing lists if you want to see scary. Particularly scary in the context of supposedly stable file systems.

  21. Anonymous Coward
    Anonymous Coward

    lazy devs? Surprise!

    1. Anonymous Coward
      Anonymous Coward

      Re: Lazy devs

      Devs I know who used to write it right are pissed off because they aren't allowed to write it properly these days. Management are just interested in how fast they can get things out the door. The cheaper the dev costs, the more profit. The company can't equate sales with SW quality, so in their minds this relationship holds true.

      I was recently asked to write some in depth training material for some SW that wasn't written yet. Having worked with the company in question for 25+ years, I thought no problems. Send me the reference spec and I can write the initial take on that then go through and sort out any discrepancies at the end, once the SW is available to test.

      Errrrrr reference specs?

      Errrrrrrrr Oh we don't do those any more.

      I presume MS work this way; I can't think why they'd accept a $1M-a-day fine from the EU rather than hand over the specs for a protocol, unless they simply didn't have the spec of the protocol in question.

  22. J.T

    I was recently visited by a storage company that had yet to have a hard drive failure (good thing too: no online spares, and proactive sparing limited to SMART monitoring). Then there are the folks on my team dealing with Oracle ZFS, keeping it under 80% utilised while the system heat and software RAID kill an unusually large number of disks, all while getting the historically awful Oracle software support.

    You get what you pay for.

  23. Mike Flugennock

    Slow Software Movement?

    Oh, f'crissakes no, Martin... InDesign runs slowly enough as it is.

  24. ItsNotMe

    And this goes hand-in-hand with what Apple does...

    ...have their users be Beta Testers for their HARDWARE. Happens all too often these days.

  25. Psymon

    I'd agree with Paul 87 and therums about the Internet, but I'd like to submit XP as another factor

    My reasoning is thus:

    Before XP, home and professional markets were completely separate, and their two methodologies as alien to each other as carbon and silicon based life forms.

    If you were designing software for NT, then your target market was clearly identified as a networked, business environment, and you designed your software appropriately.

    This meant compliance with networking and security standards. Your software had to be resilient and flexible enough to cope with the myriad of network configurations, ACL restrictions, and of course, you are answerable to your multinational client with its army of lawyers.

    If you were writing software for the home market, on the other hand, it was much more of a Wild West. Games were dumped in the root of C: so that they could be quickly navigated to in DOS, and rules were merely standing in the way of you gleaning a couple more FPS out of your game.

    You were actually rewarded for bypassing standards, blitting the hardware and taking shortcuts.

    Along came XP, and these two worlds collided with such force, we are still feeling the chaotic repercussions today. When the NT kernel became the platform for both, XP was flooded with rule breaking games, and hastily banged out code by teenagers in their bedrooms.

    This quickly gave rise to the situation we are all familiar with. You had to run as nothing less than admin for all your software to work. This quickly bore a vicious circle, with small developers, lacking the resources to fully research all the intricacies of the NT platform, simply making assumptions that this should be the norm.

    As evidence I submit my time as sysadmin in a school, five years on from XP's release. The niche software, sometimes written by programming teams of one, would make a security consultant break down in tears: storing config files in the Windows folder, ignoring the registry, making assumptions about profile folder rights… I could go on… and on…
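
    A sketch of the distinction being made here, with a hypothetical app name: per-user configuration belongs somewhere writable without admin rights, not hard-coded into the Windows folder.

```python
import os

def per_user_config_dir(app_name):
    # A writable, per-user location: %APPDATA% on Windows,
    # falling back to ~/.config elsewhere.
    base = os.environ.get("APPDATA") or os.path.join(
        os.path.expanduser("~"), ".config")
    return os.path.join(base, app_name)

# The anti-pattern the comment describes: a fixed path that only
# an administrator can write to, forcing users to run elevated.
BAD_LOCATION = r"C:\Windows\niche_app.ini"

print(per_user_config_dir("NicheApp"))
```

    The real Windows API for this is the Known Folders mechanism; the environment-variable lookup above is just the simplest portable approximation.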

    Even Mozilla are guilty of many similar faux pas, which is why you don’t see any real corporate take-up. The sudden influx of lazy and/or hacker coders gave birth to a compromised NT environment that lasted more than a decade, giving rise to an entire new generation of coder who believed that this was the way things should be done.

    I’ve only recently seen a change in trends with the proliferation of Win7. If the UAC comes up at any time you’re not installing NEW software, the programmer has done it wrong. End of story. The UAC is embarrassing a lot of corporations to go back and write it the right way, but we’ve still a long way to go.

    Perhaps Win8s Android-esque declaration of rights at install time will push things further in the right direction?

  26. Super Timmy

    Software Quality? Who cares?

    First, let me state that, obviously, software should be of good quality. As an infrastructure person you should expect that. But let's be honest, who else cares about that?

    The developers can try to do their best to make sure it works, but they don't have access to your incredibly complex production environment. Besides, they won't get any penalties because your environment can't handle their software anyway. You might tell a doctor what other medicines you're taking before they give you new (fully tested) medicine, but you won't tell the developer what other software you use.

    The buyers (aka your boss) don't care if the software is good either; they just want it installed into your environment and to be able to work with it. Oh, will it crash your other applications? Doesn't matter to them. Oh, could you also support my shiny new Somephone 5 as well?

    Neither the buyers nor the sellers think quality is important. If you want to change that, you could start by telling them. :)

  27. Dazed and Confused

    Why bother to make it better

    Since SW makers have convinced themselves, and seemingly the world, that normal laws don't apply to them, why should they bother to make it any better? You "agree" to a license that says the only right you have is the right to hand over your money on their demand.

    Until they are forced to accept that they are responsible for failures in the products they expect to be paid for (an act normally referred to as selling), why the F*&^ should they worry? If it's sh*t, it won't cost them anything.

  28. Tim Wolfe-Barry
    Thumb Down

    So when will Software Development become Engineering?

    I think (and have for a long time) that Software Development is stuck about where Brunel was with Civil Engineering.

    We can see the possibilities, but we haven't yet worked out the best way to do stuff and so everyone reinvents everything each time.

    Back in 18xx Civil Engineering was exciting because people didn't know how to build large strong bridges and so did the best they could. Then, when the bridge fell down in a storm (Tay Bridge, or Tacoma Narrows more recently) they did it better the next time.

    Nowadays Civil Engineering has progressed to the boring position where we can build something like the Padma bridge - any consulting civil engineer could tell you how (or give you a max of 2 choices) and the actual construction gets farmed out to one of several global consortia.

    We're still at the 'Bridge falling down' stage; but frustratingly seem unable to learn any lessons or improve matters the next time and probably won't until some real disaster strikes and the losses from cr*p software equal the casualties from earlier 'real' disasters.

    1. The Alpha Klutz

      Re: So when will Software Development become Engineering?

      I'm not in a hurry for it to change

      You said it yourself, the work will be farmed out to global consortia who will employ a limited number of the world's best engineers and an army of grunts. They probably won't employ you or your friends and if they do the work will not be fun as you will apply x formula to y problem or get fired. With emphasis on the firings and shitty treatment, as no one really cares how good an engineer is anyway.

  29. chris lively
    Linux

    Software devs are between a rock and a hard place.

    Once a project is released, there is immediate feedback about what to add. So, the devs start working on that. However in our fb/twitter world users aren't content to wait a few months for everything to be properly tested prior to releasing an update. Instead, they will quickly jump ship. So, you release faster.

    Corporate wants the job done in as cheap ass a way as possible. So we hire guys with very little experience or offshore to some other group that has no idea what is really needed.

    All of that points to a ton of pure crap being produced and "shipped".

    Hell, I had a call with a client two days ago that was demanding a new feature. I told them we'd put it in our next release, scheduled for 30 days from now (small feature, but we like regression testing). They threatened to walk if it wasn't finished this week. The sales team never told them we did this; it was a completely new requirement that just materialized out of their mouths. So, what do we do? We put the damned feature in now, then publish and pray.

    So, are users at fault? Partially. There is enough blame to certainly go around.

  30. seismofish
    Happy

    a Slow Software Movement?

    "Perhaps it is time to start a Slow Software Movement that focusses on delivering things right first time?"

    Um, that'll be Debian then...

    1. jake Silver badge

      @seismofish (was:Re: a Slow Software Movement?)

      Not Debian. Slackware.

  31. Mike VandeVelde
    Holmes

    crap software is definitely a group effort

    You have 2 camps of extremists. On one side are people who say quality will naturally evolve in the hive-mind setting and any attempt to force it is a waste of effort. On the other side you have people who say you should be able to launch big-money lawsuits at Microsoft every time Windows crashes and wipes out their PowerPoint presentation. Most of us make our way in the real world, which is a vast grey area in between.

    I really don't think the devs can be blamed, no matter how poor the quality. Most often someone else considered them acceptable and hired them, someone else produced the requirements (or not as the case may be), someone else set the deadlines, someone else made the decision to release.

    Devs shouldn't do their own quality assurance anyway, beyond a bare minimum. How does that old saying go? Anyone can write software so sublimely elegant that he or she can't see any bugs in it at all.

  32. Anonymous Coward
    Anonymous Coward

    Bend, don't break

    Far better that you get something which is good enough and evolving than the massive failures where people are left with nothing but a huge bill, e.g. government IT. Companies can get instant feedback on what is important to fix or improve first, and so overall end up offering a better product, more tailored to the majority of their users, without all the guesswork. Redgate are a fine example of using Agile techniques and built-in user feedback to drive development with a beta or live userbase, and of how successful that can be without being shoddy at the same time. (I am not associated with them whatsoever.)

    In the current online world things are often beyond our control and we are forced to design defensive, gracefully degrading systems. Core functions and data are protected, while it doesn't matter much if some UI element or nicety breaks for a short time. Servers can go offline and the system works around them, rather than relying on one machine that is too big to fail. This is gradually coming over to the mainstream. It is hard to design loosely coupled structures where features can evolve as required without ending up in a tangled mess of code or legalities (or leaving it too open for your competitors to step in and benefit from your hard work). But people are learning how to do this well.
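    A minimal sketch of that "protect the core, let the niceties fail" idea (the function names and the fake backend here are made up for illustration, not from any real system):

    ```python
    def fetch_recommendations(user_id, backend):
        """Try the optional service; fall back to a safe default on any failure.

        The page's core content still renders even if the recommendation
        widget's backend is offline."""
        try:
            return backend(user_id)
        except Exception:
            # A broken nicety degrades to "no recommendations", not an outage.
            return []

    def flaky_backend(user_id):
        raise ConnectionError("recommendation server offline")

    print(fetch_recommendations(42, flaky_backend))  # -> []
    ```

    The same pattern scales up: timeouts, retries and circuit breakers are all just more elaborate versions of catching the failure close to where it happens and degrading instead of dying.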

    The biggest barriers are actually not technical at all. Marketing people like showy releases with large changes all at once. Businesses like to buy a one off product that they guess will be suitable for a while rather than a service that will continue to meet their needs. Consumers like to be one version up on their neighbours. All of these drive a model where changes are drastic, the lag to feedback is bigger, time required to find and fix bugs is longer, pressure to release without spending that time is greater.

  33. Anonymous Coward
    Anonymous Coward

    'Software Assurance' and similar schemes are a part of the problem. My area is Citrix, and on top of the license cost, Citrix pushes people to buy 'Subscription Advantage', which means that if a new version of the software is released while you have active SA, then you get a free upgrade. So if you pay thousands for SA on top of your licenses, and then no new version is released, you're going to be a bit miffed. So they've pushed themselves into ever more frequent release schedules so they can justify the SA sales. As a result, every new major release seems flakier than the last. They were forced to lengthen the lifecycle for some of the products earlier this year after a customer backlash.

  34. Wil Palen
    Facepalm

    It's the suits stupid

    Judging from the way the contracting world works, I'm not surprised.

    With few exceptions, all these contracting bureaus / 'headhunters' do is count the technical terms on your CV and match them against the terms in the job requirements. The highest number wins, regardless of actual competence (which is of course harder to measure, but hey, that should be their job).

    So guy #1 with 2 years' experience in framework A wins over girl #2 with 5 years' experience in similar frameworks B, C, D and E...
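    The matching described above boils down to something like this toy sketch (the candidates and frameworks are invented; no real agency's algorithm is this honest about it):

    ```python
    def keyword_score(cv_terms, required_terms):
        """Count how many required buzzwords appear on the CV; nothing else matters."""
        return len({t.lower() for t in cv_terms} & {t.lower() for t in required_terms})

    candidates = {
        "guy #1": ["Framework A", "Java"],  # 2 years, exact buzzword match
        "girl #2": ["Framework B", "Framework C", "Framework D",
                    "Framework E", "Java"],  # 5 years in similar frameworks
    }
    required = ["Framework A", "Java", "SQL"]

    # Rank purely by raw overlap; depth of experience never enters into it.
    ranked = sorted(candidates,
                    key=lambda c: keyword_score(candidates[c], required),
                    reverse=True)
    print(ranked[0])  # guy #1 wins on term count alone
    ```

    Which is exactly the problem: transferable skill in frameworks B through E scores zero against a spec that says "Framework A".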

  35. ServiceVirtualization works

    Martin:

    You've missed the point, friend. The problem isn't with hasty or shoddy software development. Development must be hasty for enterprises to keep up. The problem is with the development process, which must be transformed. I've blogged more about that here: http://servicevirtualization.com/profiles/blogs/hey-martin-the-software-s-not-the-problem-it-s-the-development-pr

    Just some food for thought. More people need to find out about this technology.

    Cheers from America!

    Mike

    1. Michael Wojcik Silver badge

      Sigh.

      Your silver bullet is just as leaden as the rest of them. Can it improve the situation for some software projects? Sure, but so can a thousand other things. Will it solve all quality problems, or even most of them? No. For one thing, it butts up against the same economic, sociological, and psychological issues that other technical solutions do, such as the diminishing rate of return on software quality after a certain (low) point and resistance to tool adoption and use. And like any black-box testing method, its domain won't include failures that aren't caused by contracted inputs, such as interference from unrelated processes.

      My employer sells software-quality tools too. I think they're pretty good, for what they're meant to do, and I think a great many software projects could be improved by employing them or their competitors. But I'm not going to pretend that they solve the "software quality problem". No one class of intervention - technical, methodological, educational, professional - will do that. It's more than likely that nothing will, though the situation may eventually improve somewhat.

      (And incidentally, your site fails to degrade gracefully if scripting isn't enabled. That's unacceptably lazy. Your web team ought to read Platt.)

  36. Michael Wojcik Silver badge

    No silver bullet

    It's depressing how many commentators want to point to a single explanation for poor-quality software, whether it's incompetent developers, agile methods, developer or machine resources...

    There isn't a single cause. There isn't even a primary cause. That's why, as Fred Brooks and countless others have insisted, there isn't a silver bullet for fixing the problem. There are many contributing factors, in all the nodes of the graph: in development, in management, in sales, in marketing, in customers, in users, in resources, in methods, in designs - in everything. I'd argue that you can't even discuss the problem coherently without engaging in the economics of the industry; the sociology of software development and consumption; and the psychology of developers and end-users.[1]

    This isn't one of my primary areas of research, but I still have a shelf full of books on the subject: The Mythical Man-Month, AntiPatterns, Why Software Sucks...[2] It's not something that's going to be diagnosed, much less solved, by J Random IT Guy.

    And there's nothing in this article, or in the comments, that hasn't been said - and researched - before. That doesn't mean they're not worth reading, of course; we need popular, off-the-cuff treatments of topics because no one has the time to learn about more than a handful of subjects in any depth. But there won't be any groundbreaking happening here.

    [1] There's a ton of high-quality research in all of these areas. I have a handful of books just on how technical workers communicate, for example, like Winsor's Writing Power and Spinuzzi's Network. These are actual case studies of workers in large corporations, not the handwaving and anecdotes of armchair pundits.

    [2] If you only read one book on the subject, by the way, I recommend Dave Platt's Why Software Sucks. Mythical Man-Month is a classic and very important, but it's more about managing the development process. WSS is anecdotal, opinionated, and peripatetic; it doesn't even pretend to try to be comprehensive, and it's written at least as much for end users as for developers. But it's highly readable, even entertaining, and sets the reader on the right path.

  37. Anonymous Coward
    Facepalm

    Slow software

    I do agree; I'm not keen on the tendency to develop ever faster. Firefox, for example: I still can't move my bookmarks around properly. How can you go through tens of major releases and not fix that? But the excuse these companies always use is that they have to do this in order to compete, which means people are buying it. STOP BUYING IT!

  38. Hoosier Storage Guy
    FAIL

    SDDC

    A very true article, which is why I'm not exactly excited about the new catchphrase: software-defined data center.

  39. Atonnis
    Devil

    You actually hit the nail on the head without meaning to...

    The problem is - as the article asks - Users.

    Not the admins, or the IT guys (except perhaps those who lust for the days of the hairy, fat, scratchy-belly dork in his unwashed t-shirt living in the basement), but the Users.

    It's all implied in that one word - Users - the word that has become synonymous with 'oh crap what have you DONE to this machine?', 'uh-huh...you just clicked on the link and clicked 'Yes' because the window popped up and said you had a virus and now it's my fault your system is slow and keeps crashing', and 'you installed Google WHAT?!!??!?'

    Once people were given the knowledge and access to install stuff themselves, it all went downhill. No longer were there standards and minimum quality levels, or testing of software with an eye to reliability, longevity, cost or quality. Suddenly, everyone thought they knew better because 'they had installed that toolbar at home and it was so useful to them there'.

    As annoying and frustrating as the Windows UAC stuff can be, looking back over the last few years it's been an absolute blessing in business hours not wasted. Granted, I can't get paid for sitting around waiting for reinstalls and updates so much any more, but then I'm weird: I enjoy my job and want to learn and do more.
