Comments on: The IT kit revolution's OVER, say beancounters - but how do they know?

One of the great problems within economics is in trying to work out what's a structural change, what's a cyclical change and what's being buggered up just because you're not measuring it properly. For example, we can look at how much companies (or more accurately, non-household entities) are spending on the computing …

Efficiency gain

Two points:

First, you touch on investment vs consumption.

I suspect that this is a big part of things.

There is a genuine change in the market: it has benefits, but it messes up the statistics.

If I used to buy a software package and charge it as an asset then (somehow, sometimes...) that is "investment".

But if instead I use Microsoft Office 365 and pay per month, then it is an expense.

Or if my company bought a Blackberry that was an asset - but if I take out a contract for BYOD and expense it each month it no longer is.

In other words, the statistics could be completely correct: investment is falling because cloud is delivering genuine efficiency / productivity benefits.
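As a toy illustration of that point (all figures invented), the same annual IT spend registers as investment when capitalised but disappears from the investment statistics when paid as a subscription:

```python
# Hypothetical numbers: the same cash spent on IT shows up as
# "investment" when capitalised, but vanishes from the investment
# line when paid as a monthly Office 365-style subscription.

annual_it_spend = 12_000.0

# Old world: buy a software package outright and book it as an asset.
capex_investment = annual_it_spend      # counted as investment
capex_expense = 0.0

# New world: pay the same total as a monthly subscription.
opex_investment = 0.0                   # nothing hits the investment line
opex_expense = 12 * 1_000.0             # it is all operating expense

print(capex_investment, opex_investment)
# Measured "IT investment" falls from 12000.0 to 0.0 even though the
# company's actual IT spend is unchanged.
```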

The second part is genuine efficiency.

Some of those assets just vanish.

Some of course don't.

I no longer buy a Dell rack for an email server if I use Google hosted email (no asset on my balance sheet) - but Google buys servers instead (so an asset & investment).

BUT there are incredible efficiency gains: they probably pay 50% of the price I paid, and through economies of scale (trunking gain) they get far better utilisation.

4
0
Silver badge

Tim, you mention "there was a vast amount of money wasted in dealing with Y2K".

Why are you perpetuating this myth? It gets trotted out a lot as being an example of a non-event, by journalists in general but even by IT insiders who should know better.

The fact remains that had there not been a lot of time and money spent behind the scenes, the Y2k bug would have been a disaster, but it was averted by a lot of hard work.

Unfortunately, because most of the work done was behind the scenes, the general perception is that it was a load of fuss about nothing.

If you mean that the money was wasted because it shouldn't have been necessary to correct the short-sightedness of those who programmed stuff using a two-digit year then I agree with you, but I doubt that's what you meant.
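For anyone too young to remember, that two-digit-year short-sightedness is easy to sketch (a hypothetical Python illustration of the failure mode, not anyone's actual legacy COBOL):

```python
# The classic two-digit-year bug: dates stored as "YYMMDD" strings,
# as much legacy code did. The function is hypothetical; the failure
# mode is the real one.

def years_between(start_yymmdd: str, end_yymmdd: str) -> int:
    """Naive elapsed-years calculation that trusts the two-digit year."""
    return int(end_yymmdd[:2]) - int(start_yymmdd[:2])

# A record opened in 1998, evaluated at the end of 1999: fine.
assert years_between("980101", "990101") == 1

# The same arithmetic on 1 January 2000 ("00") goes badly wrong:
elapsed = years_between("980101", "000101")
print(elapsed)  # -98, not 2: interest, ages and expiry dates all break
```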

27
1
Silver badge
Thumb Up

Tim, you mention "there was a vast amount of money wasted in dealing with Y2K".

Why are you perpetuating this myth? It gets trotted out a lot as being an example of a non-event, by journalists in general but even by IT insiders who should know better.

It's very similar to the arguments used by people when they claim you've wasted your time on something because things are no better. 'We pumped the water out of the ship for three hours but it still sank so it was a waste of time' or 'All the effort spent on road safety and still people are being killed and injured'.

In both cases people are failing to understand that sometimes maintaining the status quo is an achievement in itself. I'm sure there's a name for this kind of argument fallacy; this, perhaps.

Of course the Y2k argument is slightly different. It looked like nothing needed to be done precisely because we did such a good job of fixing things.

6
0

Mixed messages from Y2K

Alister, in my experience both you and Tim are correct.

There was a lot of time and money spent on Y2K. Much of this was necessary, and had it not been spent then there would have been real problems.

There was also a lot of hype, wastage and bandwagon-jumping. Upgrades or expenditure that would have been difficult to make a business case for on a stand-alone basis also got rolled in under the Y2K "gold card" / FUD factor.

I was a programme manager on Y2K - hence the grey hair!

7
1

Averted disasters are beyond the beancounters' understanding

"The fact remains that had there not been a lot of time and money spent behind the scenes, the Y2k bug would have been a disaster, but it was averted by a lot of hard work."

That's how it always works when beancounters meet IT people... It's apparently a lot easier to justify big spends on disasters that have actually happened than on preventing them.

For that reason many companies have poorly maintained sites and code and infrastructure in production use, and any attempt by eager developers, system and network admins to get some cash for long overdue (disaster prevention) maintenance is rejected ("no budget for that")... until a big disaster happens, and then management suddenly asks "what can we do to prevent this from happening in the future"? But nothing ever changes, although the answer is bloody obvious.

13
0
Bronze badge

That's not quite what my original said

I followed that remark with a parenthesis (and let's not get into the argument about whether or not it was well spent now, shall we?).

Editors edit, as they should do, and that part they, quite rightly (for Editors are editors), dropped.

1
0
Silver badge

Re: That's not quite what my original said @Tim Worstal

So please let us know what got missed out right here. And let us exactly get into the argument about whether or not it was well spent; you make a claim, you justify it.

Further:

> then we splurged way too much on dotcoms

but in a previous missive you said

> The first point to make is that no one can decide what represents value for other people. Other people get to decide what is value for them. I might well have decided, for example, that Facebitch is of no value whatsoever and I might even be right about that. The fact that 1 billion people disagree with me, however, means that it does have value for them

So which is it? Without consistency, how can you possibly apply the word 'science' (dismal or not) to economics?

And yet further:

> Companies are, in general, flush with cash, and money is, compared to recent times, damn near free for corporations to borrow

As ever you measure solely in money, and yes of course it's important, but by far the biggest lever in the IT toolbox is knowledgeable people, not tech kit. Software and hardware are tools, and if you can't use them effectively, their value is hugely diminished. And they're often underused/misconfigured/misunderstood/neglected and generally wasted. Source: years of working in this industry. Other sources: other commenters here will tell you the same.

5
0
Silver badge

Re: That's not quite what my original said @Tim Worstal

On 2nd thoughts, the "and yet further" part, though arguably valid, may be beyond the remit of economic investigation so feel free to ignore it. I am most curious for your response to the rest though.

0
0
Silver badge

Y2k

Apart from the (very necessary and largely effective) recoding of antique COBOL (and other) code needed to avoid Y2k errors, there was also a huge surge in spending (mostly on PCs). Some flim-flam merchants tried to persuade PHBs that their PC fleet would all turn up its toes and die on 1/1/2000 - this was (almost entirely) a lie. Some IT/infrastructure managers took advantage of 'free' (at least in budgetary terms) money to refresh their PC fleet, leading to the aforementioned surge in PC spending, followed by a compensating drop over the next few years. I know, I was that IT manager.

3
0
Bronze badge

Re: That's not quite what my original said @Tim Worstal

This is the parenthesis that got dropped: (and let's not get into the argument about whether or not it was well spent now, shall we?).

0
0
Silver badge

Re: That's not quite what my original said @Tim Worstal

You're rather good at avoiding inconvenient questions.

1
0
Silver badge

And a lot of money was wasted (or injected into the economy if you are an economist):

On audits - going around ticking off printer serial numbers

Preparing Y2K compliance certificates - it's a printer FFS

Obtaining assurance of Y2K compliance from manufacturers -

Replacing it with a brand new expensive inkjet because Epson don't have a Y2K compliance cert for an ancient dot matrix

Having staff on double time over the holiday - in case of any issues - for a completely non-critical product that nobody would be using until after the holidays when any problems could be fixed.

4
0
Silver badge

Re: That's not quite what my original said @Tim Worstal

> then we splurged way too much on dotcoms

but in a previous missive you said

> The first point to make is that no one can decide what represents value for other people. Other people get to decide what is value for them. I might well have decided, for example, that Facebitch is of no value whatsoever and I might even be right about that. The fact that 1 billion people disagree with me, however, means that it does have value for them

So which is it? Without consistency, how can you possibly apply the word 'science' (dismal or not) to economics?

Didn't the markets decide that we splurged way too much on dotcoms by spectacularly crashing?

Or have I missed your point?

1
0
Silver badge
Headmaster

@Alister: Indeed, one could perhaps illustrate this idiocy by the following (imaginary?) conversation.

A: I am really pissed.

B: Why is that?

A: I have paid fire insurance for a whole year.

B: Why are you pissed about that?

A: My house hasn't burnt down yet, total waste of money.

5
1
Bronze badge

Re: The popular "Y2K Myth" idea is based on ignorance.

There is no argument because some of us know the cold hard facts and the facts are available in the written records of many companies.

Y2K projects were vital to prevent widespread disaster in business world-wide.

There is absolutely no debate about that amongst fully informed people.

Sure there was probably some scamming going on:

1. Managers getting legitimate work done (at higher Y2K wages) now rather than later, under the guise of it needing to be done urgently.

2. Sales people falsely telling companies that perfectly good machines would not work after Y2K.

Also prices of programmers went up and the prices consulting firms charged went up hugely during Y2K, but that wasn't a scam, that was simple supply and demand. Supply is constant while demand goes way up, so price goes up.

So sure, some scamming, probably less than 5%, perhaps much less than 5% since there were so many legitimate ways to make money if people only had the time. But the overwhelmingly great majority of the Y2K expenditures were absolutely vital to continued business.

I worked for 4 client companies in Winnipeg and Toronto during Y2K.

My guess is 95% of all Y2K projects were necessary for business to continue, 4% was legitimate work done early using Y2K budgets, and at most 1% was actual fraud.

The Register should lead the way to ending the popular lies that IT types invented the Y2K Myth for personal profit.

0
0
Boffin

Y2K waste

Certainly, Y2K was a non-disaster because of all the investment. Much better than the slow-motion wreck of IPv4 exhaustion.

But there was also a lot of waste. Equipment was bought or replaced for Y2K when it really didn't need to be. For example, I remember someone claiming that my 486 would not be able to boot in Y2K unless I bought his custom RTC. This was not the case.

0
0
Bronze badge

My take is that Y2k fuelled a lot of hardware investment in 1998-1999 which, while needed, far exceeded the normal/average amount of spending one sees in a system where a third of the hardware is replaced each year on a three-year cycle. The overall economy fairly boomed in the years leading up to Y2k, and tech spending quickly receded for a time after that.

el rupester's comments about cloud usage skewing the investment numbers are right on: Big Cloud is spending less than its customers would individually. Short of some Y2k-style necessity or some dramatic advance in tech, I would think that tech hardware investment will show "normal" growth compared to the past. Also, there are far fewer existing companies with no computerisation fuelling new investment than there were in days gone by. Maturity has its advantages and drawbacks economically...

0
0
Silver badge
Coat

"The IT kit revolution's OVER, say beancounters..."

Fantastic, Mr. Beancounter - let me introduce you to my friend here...

KZZZZEEEEEEEEERRRRRRRRRRRRRRTTTTTTTTTTT!!!!!!!!!!

2
0
Bronze badge

With you there Tim

10 or 15 years ago many company accountants would have had to record all IT spend as an asset otherwise they wouldn't get approval and the balance sheet would look crap. Then it was written down at 10 to 25% a year so never hit zero (really - I had many computers in skips which still had a value).

Yet the day it got installed it became essentially worthless (in the sense you couldn't sell it for anything like the same money). Part of this comes from the habit of dealing with machinery and cars etc of course, which do retain a value for longer.

These days we don't capitalise anything IT - it's written off as expenditure on day one, and we buy in smaller chunks (rolling replacement), exactly as you say.
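The write-down behaviour described above (a fixed percentage of the remaining value each year, so the book value never reaches zero) can be sketched with invented figures:

```python
# Declining-balance depreciation: write down a fixed percentage of the
# remaining book value each year. The numbers are illustrative, not
# from any real balance sheet.

def declining_balance(cost: float, rate: float, years: int) -> float:
    """Book value after `years` of writing down `rate` of the remainder."""
    value = cost
    for _ in range(years):
        value -= value * rate
    return value

pc_cost = 2000.0
for year in (1, 3, 5, 10):
    print(year, round(declining_balance(pc_cost, 0.25, year), 2))
# After 10 years at 25% the PC still "carries" about 112 in book value,
# while expensing on day one would have written it off immediately.
```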

4
0
Bronze badge

Re: With you there Tim

If everyone's doing as you are then that section of the GDP figures would be zero. So obviously not everyone is but nice to see that there's at least something to the intuition/suspicion.

1
0
Thumb Up

Re: With you there Tim

The falling costs are a major part of it. My organization doesn't even track assets below a certain size. I haven't asked the beancounters what the exact number was. So, a $10,000 PC in the 1980's would probably find itself on an asset sheet, but a $700 convertible laptop would not.

0
0
Bronze badge

Compare with marketing spend

Consumer goods companies I used to work for would sometimes "invest" in marketing (TV adverts for example) to build a brand. All the money was on revenue budgets, but the aim (and indeed the effect) was to achieve more than a short term revenue goal.

Increasingly I see IT spending falling into this category too. Especially with that dreadful misappropriation of the term "digital".

1
0
Silver badge

I'm with Autor

The paradigm shifts have happened. Standalone->networked->integrated systems. We have the networked systems and humans are out of the loop for pretty much everything where they could be out of the loop. The paradigm shifts provided order-of-magnitude efficiency gains, rather than just incremental improvements.

What is the hot tech now? Virtualisation. That's not a process-improvement thing, that's just admin reduction. VMware is sitting pretty, because the tech is still difficult, but the large costs will drive the freebie tech as Google and Amazon look to reduce their costs.

As I said, I'm with Autor - the massive investments of the past brought order-of-magnitude improvements. No-one is offering that now, not even the fabled cloud. Cloud engineering is so much overkill, and failures result in such poor headlines. It relies on correct contention assessments. Consolidation can only go so far before it runs into human frailty. If anything, consolidation increases complexity and frailty.

On the subject of MSOffice, I don't think it has improved my productivity for quite some time, yet the cost of buying new versions is recurring. I'm not sure that's very sustainable. Even Exchange/Lync which has added new features and integration doesn't really add much to my productivity. Here is where the beancounters are getting their information. What's my ROI on this expenditure? Why should I not just keep using the existing software? The answer increasingly is, "we have no technical need for new features or to upgrade, but we have to keep in support." or "the vendor has declared the product obsolete."

In this scenario, you pay as little as you can to keep the tech ticking over - maintenance mode.

2
0
Gold badge

Re: I'm with Autor

I'd disagree with you on cloud not having a huge productivity effect. Or at least the potential for large impact. There is still a very large section of business that has only partially computerised. This is the small business sector. Companies too small to manage IT, even when they could afford it.

Cloud CRM is now available, and although it is a bit of a time-sink, it can be incredibly powerful for some companies. Cooperation, communication and sharing tools, previously only available to big companies, can allow a lot more working from home. I know this, because these are things that my small company does. Which means we spend more on IT than we do on our office. Theoretically, we could ditch the office. But it's nice to meet for real sometimes.

There's an awful lot of new areas that computers might take over. For example, 90% of the building services industry is still managed by blokes turning things on and off. Or increasingly, not doing so. As caretakers and maintenance seem to have been dropped in the hopes that magic fairies will keep all the plant maintained.

...Some people may appreciate the fact that my auto-correct has replaced cloud with clown, every time i've used it in this post...

2
0
Silver badge

Analysts

are just worried that their pyramid scheme, sorry, the stock market, is going to collapse before they have made enough money - and they are greedy b*stards, so they never have enough money, so they worry when markets mature and the gold grabbing slows down with it.

3
0
Anonymous Coward

Lease versus purchase analyses

For decades the accountants have been doing analyses of leasing versus buying in order to put the figures in the desired column. This includes the option of buying, selling to a nominally independent entity, and booking the lease payments as expense. There's considerable leeway for the lawyers and accountants, and considerable motivations around corporate financial strategy, taxes, R&D credits, etc., which I'd hope the economists understand.

0
0
Silver badge
Coat

"there's four parts to national income"

there's four parts to national income when we measure it this way:

1) wages,

2) profits,

3) self-employed income,

4) subsidies to consumption and

5) taxes on consumption.

Well, I guess that's why economics doesn't work.

5
0
Bronze badge

Re: "there's four parts to national income"

Your four and five there are regarded as one item:

4) subsidies to consumption minus taxes on consumption.

The net figure is our fourth.
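With entirely made-up figures, the netting works like this (note that national-accounts conventions often book the net item the other way round, as taxes less subsidies):

```python
# Toy figures (invented) for the income approach as described above.
wages = 600.0
profits = 250.0
self_employed_income = 100.0
subsidies_on_consumption = 20.0
taxes_on_consumption = 70.0

# The five listed items collapse to four: the last two are netted
# into a single figure, per the reply above.
net_item = subsidies_on_consumption - taxes_on_consumption

national_income = wages + profits + self_employed_income + net_item
print(national_income)  # 900.0
```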

1
0
Silver badge

Re: "there's four parts to national income"

Thanks Tim. I did suspect that was what you meant but that's not how I read it :-)

PS keep up the good work. I might not always agree but it's always entertaining.

1
0
Silver badge

More not very helpful economics-speak

"There was a vast amount of money wasted in dealing with Y2K, then we splurged way too much on dotcoms and it's hardly surprising that we're investing less in the wake of one of the largest recessions of modern times".

What's the matter, don't you believe in free-enterprise capitalism and the magical powers of *** THE MARKET ***?

As for "hedonic adjustment", isn't that the clever scam by which the US government replaced the price of beefsteak with that of "hamburger" (cheapo mince to you and me) to disguise the rising cost of food? It doesn't fool anyone.

0
0
Silver badge

Re: More not very helpful economics-speak

"hedonic adjustment" is how you hide the $100 rise in food costs by including a 40" flat screen TV in the "basket of goods and services" and showing that it has dropped from $1000 to $300 in the last 5 years and therefore inflation is under control.
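The weighting effect described here (strictly a basket-composition effect rather than hedonic quality adjustment proper) can be sketched with invented numbers:

```python
# A toy two-item "basket" (all numbers invented) showing how a falling
# TV price can offset a rising food bill in a simple weighted price index.

def index_change(old_prices, new_prices, weights):
    """Weighted average of price relatives, as a percentage change."""
    relatives = [n / o for o, n in zip(old_prices, new_prices)]
    level = sum(w * r for w, r in zip(weights, relatives))
    return (level - 1.0) * 100.0

# item order: [food, flat-screen TV]
old = [500.0, 1000.0]
new = [600.0, 300.0]      # food up $100, TV down $700
weights = [0.8, 0.2]      # food dominates actual spending

print(round(index_change(old, new, weights), 1))
# 2.0: a +20% food rise and a -70% TV fall net out to "2% inflation"
```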

3
0
Bronze badge

What do beancounters know?

Well, while they may not be technology experts, they can still know the same things that everyone else knows. And it is a fact that over the last several years, computers haven't been improving as fast as they used to, which is why we have quad-core chips but not 10 GHz chips in our desktop computers.

Although it looks like that's set to go to six-core and eight-core chips thanks to Intel's latest generation.

And the excitement and sales have been in mobile devices, not desktop PCs; that, too, has received lots of publicity.

0
0
Anonymous Coward

Looking to my own customers' experience

In recent years, we've sold plenty of upgrades, but the larger customers have been after footprint reduction above all else. As such, I see two things happening:

- Server sales down, as each individual box does more

- Server makers looking for alternative revenue models (e.g. back to licensed capacity - something my line of business knows a great deal about already).

The sector I work in is moving inexorably towards a point where businesses, like the one I work for, no longer provide hardware, instead deploying software solutions into resilient server farms (which may or may not be the property of any given customer). This is fine by me. Certain Chinese competitors can and do offer to replace us for free (and they're big, so the business has to pick its fights carefully); they can do this because they make the hardware, whereas we have to buy our servers from the likes of Oracle and HP. As Ross Perot determined, smartly run software and professional services is an easier game to survive in.

Structural or cyclical? Intuitively the change seems structural, but I'm not convinced I understand the two words (economically) well enough to call it 8-).

1
0
Bronze badge

Has IT spending gone down? Good question.

There has probably been a shift from purchasing things that tax law requires be depreciated over a number of years to buying things that are simple business expenses.

Perhaps there were changes in depreciation laws in some countries too.

But also a lot more is leased or run off-site, or run in a cloud.

Software is more likely to be developed by external vendors rather than in-house.

And some companies may have shifted responsibility for some purchases out of IT departments, for example cell phones, memory sticks, program packages not widely used, etc.

I can't even begin to guess whether over all expenditure on IT type things has gone up or down. I would say it is going to vary company to company, industry to industry, and tax jurisdiction to tax jurisdiction.

0
0

Don't forget what constitutes capital equipment

I suspect that my workplace is not unique in raising the cost threshold for what is considered capital equipment, from $500 back when to $5000 these days.

A PC (and many other tangible items) used to be capital equipment, but not nowadays.

2
0
Silver badge

Sounds familiar

"Computers in the future will weigh no more than 1.5 tons."

- Popular Mechanics, 1949

"There is no reason for any individual to have a computer in his home."

- Ken Olsen, co-founder of Digital Equipment Corporation, 1977

"640kB should be enough for anyone."

- Bill Gates, 1981

"I have travelled the length and breadth of this country and talked with the best people, and I can assure you that data processing is a fad that won't last out the year."

- Editor in charge of business books, Prentice Hall, 1957

"Transmission of documents via telephone wires is possible in principle, but the apparatus required is so expensive that it will never become a practical proposition."

- Dennis Gabor, 1962

"It would appear that we have reached the limits of what it's possible to achieve with computer technology, although one should be careful with such statements, as they tend to sound pretty silly in five years."

- John von Neumann, 1949

"I think there is a world market for maybe five computers."

- Thomas J Watson, President of IBM, 1943

Thanks to TechRadar

0
0

In the dark ages

When dinosaurs ruled the earth, most companies couldn't afford computers or the tech priests who ran them.

So they went to what (in this part of the world) they called "consultancies", who would run a couple of big-iron dinosaurs and a flock of smaller "mini-computers" (a minicomputer fitted into a single rack; disks were a separate rack). You could buy time on the gear to run payroll, or stock control, or whatever; the consultancy operation would be a one-stop shop for all your computing needs.

Then as the costs came down and tech priests became more abundant, more & more companies started to run their own gear. They especially liked the fact that the consultancies would no longer have them by the short & curlies ... they would be masters of their own destinies :)

The PC revolution (as stated earlier in the thread), then the networked-PC fad of the 90's, meant the death of the consultancy model; cue a round or two of in-sourcing & outsourcing, and now the trend is cloud computing & managed services, which I'm having a hard time differentiating from the older "consultancy" model (other than the truly VAST scale of these operations).

So the answer to your cyclical change/structural change question is yes, it's both

0
0

Out of steam? There is now stuff called electricity!

This is only a pause... we are edging (pace Mr. Honeyball) into 64-bit. In a big while we may get optical computing, but who will then buy into it at that cost? And then there is quantum computing - if memory serves, El Reg ran a piece on that only two or three days ago...

It is, I know, not good to push arguments by using analogies (i.e. I'm about to do it) but we're past the Kitty Hawk stage and probably Bleriot's cross-Channel flight; the old Vimy has crossed the Atlantic, but everyone knows that aeroplanes that suck in cold air at the front and squirt out hot air at the back simply cannot work. As we say in Glasgow, "Eh, no?"

0
0
