I would have thought that the problem lay in machine interconnectivity rather than age.
In a machining environment, or any other that is isolated from other computers, where a system running XP or NT - or even 98 - drives specialist software on machines dedicated to that task and works successfully, there is no logical reason to upgrade everything merely to carry on doing what they were doing anyway.
Stand-alone machines can run whatever software they want without risk.
It makes me wonder whether malware isn't just the best way software companies have come up with so far to 'encourage' users to spend a lot of money merely to stand still.
As far as the NHS is concerned, I have been treated perfectly adequately by machine-led medical intervention that was still running NT. In that light I can understand their reluctance to upgrade, particularly when so much money would be spent on re-equipping machines, replacing software and re-training simply to continue doing what it is already doing successfully, when that money could better be spent on extending its scope and capacity.
Maybe the government should be employing expertise to write its own OS, tailored to its own field, and eliminate its reliance on commercial software, which by definition is profit-driven.
Being absolutely honest, outside of the computer industry nobody really gives a damn about the technological advances, bells and whistles that new stuff offers, so long as it does what it was procured for - and that is most likely a quite narrow range of tasks that XP is demonstrably quite capable of performing.
If Linux development can be done for free distribution, is it such a leap to commission specific software for these single-purpose applications, far more cheaply than relying on perpetual upgrades to nowhere?
Who'd be writing malware for that?