Planning to throw capacity at an IT problem? Read this first

Not so long ago the axiom “you get what you pay for” held true in IT. More expensive hardware and software generally produced better results. With the rise of open source and open standards, this is no longer the case. In today's world, if you'll pardon the mangled aphorism, IT isn't about how much you spend, but how you set …

  1. Anonymous Coward
    Anonymous Coward

    Errr

    If IT is a commodity, which to all intents and purposes it is, then throwing capacity at an IT problem makes sense.

    The prime issue is that as systems get more and more networked, interdependent and complex, there are fewer people who can understand the impact of change. Lots of people have had sweat wiped from their brow by having odd copies of stuff all over the shop.

    Not scientific, but business continued for another day...

    1. StaudN
      WTF?

      Re: Errr

      "If IT is a commodity, which to all intents and purposes it is".

      No, you can't order a kilogram of IT.

      Commodity: "a raw material or primary agricultural product that can be bought and sold, such as copper or coffee"

      1. Necronomnomnomicon

        Re: Errr

        Nope, but you can buy or sell a gigabyte of storage, or an hour of compute, or a thousand DNS queries. Those things are definitely commodity.

        1. P. Lee

          Re: Errr

          >Nope, but you can buy or sell a gigabyte of storage, or an hour of compute, or a thousand DNS queries. Those things are definitely commodity.

          Not really. Yes you can buy "compute" but the key thing about commodities is that they are interchangeable regardless of supplier and they can be traded.

          As a business, you can't substitute an hour of compute time from AWS for an hour of Azure compute time - at least probably not easily. Got a TB of storage on Azure? You probably can't swap that out for AWS storage without having some adverse impact on your applications.

          Cloud companies like to pretend it's all just a lump of generic compute and it's a bit like trading iron or coal, but that's just marketing to make you think they aren't locking you in.

      2. Anonymous Coward
        Anonymous Coward

        Re: Errr

        A commodity is something which has no discernible difference between offerings from different suppliers and is essentially interchangeable. Oil, orange juice, coffee, and yes, to all intents and purposes, IT: printers, network kit and computers.

        IT is a commodity and has been for years. That's why IBM, HP and now Dell are struggling. No-one can tell it's their kit apart from the badge on the front.

        1. Lusty

          Re: Errr

          "No-one can tell its their kit apart from the badge on the front"

          I could tell a Dell out of a room full of HP servers. It would be the one that I had to manage very differently, it would be the one in a different management tool, and it would be the one with the different support number.

          Yes, from a compute point of view they are the same, but that doesn't mean that clever people would consider mixing and matching without a good reason. "Whitebox" systems especially are not equivalent since they don't have any of the value add that Dell, HP, IBM have created to make the admin's life easier.

          1. Halfmad

            Re: Errr

            My manager could tell the IBM one from a mile off too, it'd be the one that cost twice as much as the others.

        2. Colin Tree

          Re: Errr

          IT suppliers spend a lot of effort locking their customers into their product, making it too difficult or impossible to transfer to a competitor (a false monopoly).

          So it tries to not be a commodity when it really is.

          Proprietary formats, protocols, applications, user interfaces.

          You make heaps of money from a monopoly, but not from a competing business model, which always becomes a race to the bottom.

  2. StaudN
    IT Angle

    "Throwing more hardware/software at the problem rarely no longer gives an organisation an advantage."

    Pardon?

  3. Mike Shepherd
    Meh

    "For decades"

    "For decades, we've been trained to solve IT problems by throwing capacity at them".

    50-60 years on, I think it's clear that the chief IT problem remains the difficulty of writing reliable software. After that, throwing capacity at it sounds good to me.

    1. Mad Mike

      Re: "For decades"

      "For decades, we've been trained to solve IT problems by throwing capacity at them".

      Not really for that long. At best, throwing hardware at the problem has been around for 20 years. No longer than that. I totally agree that throwing hardware at everything doesn't always work and often makes it worse. Similarly, taking copies of data simply solves an immediate problem at the expense of much harder data management going forwards.

      There are three major problems hitting IT at the moment.

      1. Software complexity, quality and very poor testing.

      2. Agile delivery. This encourages a lack of thought (make it up as you go along), lack of proper requirements (guaranteed to kill any project) and unrealistic expectations from the business.

      3. The next great thing. Every new thing coming along will fix the world... currently cloud (the answer to everything, according to many), but dozens of technologies from the past were billed the same. Every new technology or methodology or whatever is pitched as the answer to absolutely everything. In reality, they all have their pros and cons and none are the answer to everything.

      1. yoganmahew

        Re: "For decades"

        @Mad Mike

        "1. Software complexity, quality and very poor testing."

        Yes, yes, and no, that missed the point. You can't test something correct. You have to design it correct first and then fix the bits you miss. If it is complex, it must be designed for complexity. Which is, of course, your second point, but I don't get to point 2 until sprint 9...

        1. Mad Mike

          Re: "For decades"

          In order to get good quality software, you need to have good quality testing to pick up the bugs and problems you missed or introduced in your coding. From the bug lists I've seen over the last decade, testing is getting worse and worse, as is the coding quality. The number of bugs picked up in testing seems to be going up, but at the same time, when it hits production, you find even more than before. This suggests the testing was not as good and was failing to find the problems. Of course, a large amount of this is caused by agile development, which tends to result in a lack of testing due to the supremely fast turnaround always promised as a result.

      2. Colin Tree

        Re: "For decades"

        Capacity? What do you think a mainframe was 60 years ago?

        I see it as centralisation of power. The PC challenged that and empowered individuals, but the cloud is taking the power back. The iPad is only a window to the cloud.

        The larger an organisation the more money it can make.

        The larger the city, the more concentration of wealth, the more large cities, yada, yada, yada.

        If a company makes $50 per employee hour, you need lots of employees, or work them 24/7.

        If a database of prospective customers gives a 0.1% sale outcome, you want a really big database.

        If you can make $100M in one country, you want to reach all countries.

        It's called simple mathematics. Accountants love simple mathematics.

        But really simple doesn't always work.

      3. Mark 65

        Re: "For decades"

        I disagree with your point on Agile delivery. Agile has its place; it is just often misused. Agile works well for tactical developments, prototyping etc. and I have witnessed it work well in trading environments. However, like everything, you have to use it appropriately. Large, carefully thought-out, heavily planned non-Agile projects also go arse up. No method is perfect.

  4. Steve Davies 3 Silver badge
    Coat

    you can never have too many backup copies of that vital bit of data

    Just in case one gets corrupted/overwritten.

    Coat, with an 8TB spinning rust drive in the pocket (inside a case..)

    1. Mad Mike

      Re: you can never have too many backup copies of that vital bit of data

      More backup copies can never harm you (apart from your wallet), but the issue is copies of data used by different apps (i.e. not backups). Each copy has a distinct tendency to become another master and anarchy ensues.

  5. Michael H.F. Wilkinson Silver badge

    ONLY throwing capacity at any problem might not be optimal, but with the sheer volume of data processed in many applications, capacity is certainly needed. At the risk of winning a prize for pointing out the bleedin' obvious: you need to find the right tools first, and then assess how much compute power you need, given the optimal tools. Big data really requires you to think hard about tooling. I have seen people throw weeks of compute time on a large section of a Blue Gene machine at a problem that the right algorithm could solve in a couple of hours on a desktop PC. This wasn't even a big data problem, but it really demonstrated the difference in performance between O(N²) and O(N) algorithms (a toy sketch of that gap follows below).

    Of course, in many cases you must also wonder whether you really need all that big data. Bigger isn't necessarily better.
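    Not the Blue Gene workload itself (which isn't described here), but a minimal Python sketch of the O(N²) vs O(N) gap being described: counting duplicated values with nested scans versus a single pass over a hash set. The data size and value range are made-up numbers chosen purely so the difference shows up on a desktop machine.

```python
# Toy illustration of an O(N^2) algorithm versus an O(N) algorithm for the
# same task (counting values that occur more than once). The sizes below are
# arbitrary; the point is how the gap grows with N, not the absolute times.
import random
import time

def count_duplicates_quadratic(values):
    """O(N^2): for each element, scan the rest of the list for a match."""
    dupes = 0
    for i in range(len(values)):
        for j in range(i + 1, len(values)):
            if values[i] == values[j]:
                dupes += 1
                break  # this position has a later duplicate; move on
    return dupes

def count_duplicates_linear(values):
    """O(N): a single pass, remembering values already seen in a hash set."""
    seen = set()
    dupes = 0
    for v in values:
        if v in seen:
            dupes += 1
        else:
            seen.add(v)
    return dupes

if __name__ == "__main__":
    data = [random.randrange(2_000) for _ in range(5_000)]
    for fn in (count_duplicates_quadratic, count_duplicates_linear):
        start = time.perf_counter()
        result = fn(data)
        elapsed = time.perf_counter() - start
        print(f"{fn.__name__}: {result} duplicates in {elapsed:.4f}s")
```

    Both functions give the same answer; the nested-scan version just gets dramatically slower as N grows, while the single-pass version stays roughly linear, which is the point being made above.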

  6. chivo243 Silver badge
    Coat

    simply using what is to hand effectively is what is called for

    Just going to take a phrase from David Byrne "Stop making sense!"

    What do you mean we need to train our users....

  7. jake Silver badge

    Throwing capacity at the problem ...

    ... only leads to code bloat.

    1. Buzzword

      Re: Throwing capacity at the problem ...

      Often true; but sometimes you end up writing incredibly complex code and spending weeks optimising to get every last ounce of performance out of the system, when simply throwing more RAM into the box would have solved 90% of the performance issues.

      This situation usually arises in government-type organisations, where the budget for man-hours has been approved but there's no budget for additional hardware until the next refresh cycle.

      1. Lusty

        Re: Throwing capacity at the problem ...

        " throwing more RAM into the box would have solved 90% of the performance issues"

        RAM is almost never the bottleneck in 2016. It was the bottleneck in 1996, I'll give you that, but these days almost every server I see has way too much memory allocated, resulting in the operating system caching the entire system drive and anything else it can get its hands on. CPU is quite often a bottleneck, because too many people think a 2GHz 12-core is faster than a 4GHz 6-core. Storage is quite often a bottleneck too. Memory, though, is only useful as a temporary cache, either to improve read speed (from SSD?) or for the working set, which is limited to what the CPU can process. Using memory as a write cache is poor design unless the data is worthless, otherwise you'd have consistency troubles to deal with.

        1. quxinot

          Re: Throwing capacity at the problem ...

          It's very difficult to throw a $unit of "de-suck-ify" at a software problem.

          A cattleprod may work when applied appropriately.

          If it doesn't work, you'll still feel better about things.

    2. Anonymous Coward
      Anonymous Coward

      Re: Throwing capacity at the problem ...

      ... only leads to code bloat.

      You have a trade-off to make:

      1. At one end sits simplistic code with resources thrown at it

      2. At the other end sits more complex, efficient code that makes better use of resources but is harder to support and maintain.

      You need to optimise hardware vs wetware costs. It also has a hint of Sherman vs Panzer about it.

  8. Anonymous Coward
    Anonymous Coward

    I was loving this write-up (I even bought a book off Amazon), then you mentioned "Big Data" and my Buzzword Bingo filter kicked in and you lost all credibility in my mind.

    Nobody loves a bandwagon

    1. Mark 65

      Nobody loves a bandwagon... except a consultant.

  9. Anonymous Coward
    Anonymous Coward

    "The Dispossessed is a novel about what happens to a society that refuses to use big data analytics for even the most critical problems"

    I haven't read the book, but from the description it sounds like what they need isn't Big Data Analytics but just a modicum of basic Project Management.

    Surely you don't need to go to "Big Data" to estimate how much fresh water your population needs, or calculate the number of desalination plants required to generate it?

    1. Anonymous Coward
      Anonymous Coward

      3 litres per day (human requirement) × population count, plus some surplus for wiggle room and projected population growth as a percentage. Divide the result by the output of a desalination plant in litres per day, then add a contingency for site failures/disasters/terrorism.

      No Big Data needed....just some maths yo!
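      For what it's worth, the whole back-of-envelope calculation above fits in a few lines. Here's a minimal Python sketch; every number in it (population, growth rate, plant output, contingency) is a placeholder assumption, not real data.

```python
# Back-of-envelope sizing of desalination capacity, following the comment
# above. All figures are placeholder assumptions for illustration only.
import math

DAILY_NEED_LITRES = 3.0           # rough human requirement, litres/person/day
POPULATION = 20_000_000           # assumed population served
GROWTH_RATE = 0.02                # assumed annual population growth (2%)
SURPLUS_FACTOR = 0.10             # 10% wiggle room on top of demand
PLANT_OUTPUT_LPD = 10_000_000     # assumed output of one plant, litres/day
CONTINGENCY_PLANTS = 2            # spares for site failure/disaster/terrorism

# Daily demand, inflated for growth and surplus.
demand_lpd = DAILY_NEED_LITRES * POPULATION * (1 + GROWTH_RATE) * (1 + SURPLUS_FACTOR)

# Plants needed to meet demand, plus the contingency spares.
plants_needed = math.ceil(demand_lpd / PLANT_OUTPUT_LPD) + CONTINGENCY_PLANTS

print(f"Daily demand: {demand_lpd:,.0f} litres/day")
print(f"Desalination plants required: {plants_needed}")
```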

  10. Stefan_Minkey

    Cloud computing makes it WORSE. Now it's possible to implement all kinds of data or architecture quickly without thinking about how it operates alongside the actual business it is supporting. Cue a "how do we make this stuff work for us effectively?" crisis in 5, 4, 3, 2, 1...

  11. Fehu
    Pint

    Kind of like the Peter Principle

    As a company gets larger it makes more stupid decisions: open source tool in place 9 years; decision is made to move from a physical server to a virtual one; idiots can't configure the VMs correctly, so the app that previously handled everything thrown at it starts to crash on a regular basis; instead of correcting the bad configuration, a new app is brought in, leased at $500K per year, that does not do everything the old app did, so we're still looking for a second app to buy or lease. Lots of times when someone tells you how much they spent on that shiny new thingy, they're just telling you that they have more money than sense.

  12. Anonymous Coward
    Anonymous Coward

    Commoditization has always been here; most just overlooked it and were too busy "following the money" to get involved in it.

    Top management is often busy following the money, making decisions on buzzwords and acting on fears of things getting stifled in technical details.

    Middle management often tries to juggle following the money with having to meet direct goals, for which technical heroes are very often carried in without much regard for processes or sustainability, let alone commoditization.

    The work floor is often obsessed with "following the money" to keep management on board, trying to juggle that with dodging problems and remaining acceptable. In some corners commoditization starts acting up (storing files in Dropbox rather than on home drives, for example), but in most cases it is just a mess. Business decisions are often then driven by "technical persons".
