I have commented in the past that asking questions about ethnicity for the purposes of tracking I&D proportions is itself contributing to the existence of racial lines. Some companies now do not expose names on CVs to recruiting managers, to avoid the bias towards prejudice hard-coded into our DNA and culture. We like to think of ourselves as an evolved species, but from any rational standpoint, the fact that such measures are necessary says a lot about how far we have to go.
Who is the target market?
Remind me again why anyone needs a workstation-laptop? A tablet for general net surfing and email, plus a proper, upgradeable tower on the side, is a far cheaper, more powerful, more reliable solution all round.
But hey, I'm not MS's target market anymore, whether they choose to peddle operating systems or hardware. Quite who IS the market is beyond me.
I'll carry on being a luddite.
I'll upgrade when I need to, thanks
My 4-year-old mid-range card still does everything I want quite nicely, thanks. If there were such a thing as a mid-range-priced card now, I might consider upgrading it. The £100-200 card market, however, is all but nonexistent. Crypto is an easy blame-shifter, but one must also consider whether products people want are on sale. A £400 GPU? Not necessary. And a £50 one is usually not an upgrade. There is also more competition: Skylake onboard graphics are surprisingly good for the cost, so bottom-end cards are even more pointless. A serious analyst will acknowledge crypto, but it's not the only story in the graphics world.
Intention = Good. Implementation = Terrible.
Freedom of speech is not something Brits have, and further clamping laws around what one can even VIEW, let alone say, is an affront to democracy. Sure, there is no shortage of hate speech to go around, but at least in the US people have the RIGHT to express a viewpoint. Whether one considers it right or wrong to do so is not a criminal matter, but one for society to judge.
The idea of PC-gone-mad is not lost on me. Who gets to decide what content constitutes an offence? What if you are lured into viewing illicit material unintentionally via a malformed link, or perhaps malware or otherwise? If that happens, how does one prove or disprove whether one intentionally opened something to view it?
Once again the intention of Brit lawmakers is probably well founded, but the implementation is a joke and probably grants powers far beyond their intended role. Like legislation has never been misused before...
Cue more letters to your local MP. Too bad they are too busy debating something else (rather than doing their job and debating / improving legislation...)
Trolling in the Reg's forums... we mean, er, 'working' on the train still rubbish thanks to patchy data coverage
...Define productivity?! Not posting comments on El Reg? :-)
HS2's main benefit for me is that it'll take traffic off the older line from Marylebone to Birmingham. Might even be able to get a seat at peak times. Ha!
If I happen to be able to get wireless or 4G in work time, that's a bonus, but I would never count on either to be able to do anything other than maybe a bit of proofreading.
Trying to log into Office 365 right now? It's a coin flip, says Microsoft: Service goes TITSUP as Azure portal wobbles
...And Counting. It's only January. If you work in financial years it's obviously worse again.
What does it take to persuade the global procurement brigade that there's something better and cheaper already out there? You could point a gun at the heads of those responsible and not get them to budge. No doubt there are backhanders everywhere to persuade them to stay.
On the subject of backhanders, why do so many big companies insist on you using their own outsourced travel ticketing system, even though it is more expensive than booking yourself? Clearly there must be backhanders and shenanigans going on there too. I can't come up with many other explanations; perhaps it's some sort of job creation scheme?
The corporate world has driven me quite mad and I think I shall be looking for a small business instead for my next move.
Not so long ago, Mainframe-to-XP migration
I worked for a while at an HBOS subsidiary. Even as late as 2004 there were significant chunks of call centre operations running on dumb terminals. Some of the older staff members had never touched a GUI. As part of the XP migration I absolutely insisted that Solitaire be retained as a training tool. It was, after all, a call centre, and absolutely everyone could see you playing it while manning the phones.
I would argue it probably improved productivity. There are few more soul-destroying jobs than answering calls from incompetent mortgage brokers asking "where's my mortgage offer", only to reply "you haven't sent in document X", in the full knowledge that the broker was walking off with their 1% commission while you were on the phones on barely minimum wage.
Watson is just a brand name attached to tools for applied statistics. Sure, hardware and datasets have got bigger and faster, opening up a few new opportunities, but practical value continues to elude most of the more complex applications. How much money has your corporation blown chasing Big Data for much-promised but little-delivered returns? With R or umpteen other open source tools readily available, why would you pay a ton for licensing plus staff, when one can pay just for the staff and get the same results? Python/NumPy/SciPy are popular for a reason!
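To make the point concrete, here's a minimal sketch of the bread-and-butter applied statistics the big-ticket suites license out: an ordinary least-squares line fit in a few lines of NumPy. The data is invented purely for illustration.

```python
# Hedged sketch: fit a straight line to some made-up noisy data using
# nothing but free tooling (NumPy), no Watson required.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = 2.0 * x + 1.0 + np.array([0.1, -0.2, 0.05, -0.1, 0.15])  # noisy line

slope, intercept = np.polyfit(x, y, 1)   # degree-1 polynomial = linear fit
residuals = y - (slope * x + intercept)
print(f"slope={slope:.2f} intercept={intercept:.2f} "
      f"rms error={np.sqrt(np.mean(residuals**2)):.3f}")
```

Swap `polyfit` for `scipy.stats` or a pandas groupby and you've covered a surprising fraction of what gets sold as "cognitive analytics".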
The last 10-20 years of R&D at IBM has taken them nowhere. PPC used to be a moderately successful line of business: the Mac G4 & G5, the PlayStation 3. POWER9 is a fantastic architecture, but sales are weak and takers are few. A deal akin to the one that made x86 explode in the 1980s might be what's needed to make it competitive. A POWER9 desktop at a sensible price (say £1,000) would have me jump off x86 like a shot. Can't help but also think there's a market for the well-priced bedroom computer too. We all love our Ataris and Amigas!
IBM Mainframe persists in a few environments. Their desktop tech has almost entirely been sidelined.
I would suggest that failures to market applied statistics are now forcing IBM to use their buying power to find a few (guaranteed) revenue streams instead. Red Hat, probably amongst others to come.
Stagnating performance = no rush to upgrade
This is all not very surprising. Changes in PC specifications are well into marginal-gains territory now, so outside the workstation and gaming markets there's no rush to upgrade. Whether I have a 6 or an 8TB HDD has absolutely no impact on when I choose to upgrade; my only worry is when device X will break. Essentially the only difference between an average desktop today and one of 5 years ago is a bit of reduced power consumption. A 5-year-old machine is good enough for most tasks, so turnover is down other than for asset replacements.
Cloud storage is starting to appear in even the most staid and backward corporate IT. Local storage is increasingly irrelevant, and indeed as offices increasingly ban USB storage devices due to security risks (cue a reduction in memory stick sales), cloud is becoming a necessity. As Win7 approaches end of life, and corporates roll out ever more invasive and CPU-hungry internal spyware (Tanium client, anyone?), demand has definitely shifted towards performance over capacity.
Incidentally, our office had a load of perfectly good Lenovo T430s and T440s with spinning-rust drives. Their performance was acceptable, up until the rollout of Tanium. While the base hardware is basically fine, the whole fleet is now being changed to T470s with SSDs. Win10 is apparently due on the desktop later this year. I hope they got the hardware spec right to allow for MSSlurp plus all the corporate crapware!
Huawei's 7nm 64-core Arm server brain, fresh Intel desktop Core chips, IBM tapping Samsung for Power10, and more
Re: autonomous cars
On the grounds that many Western legal systems are founded on case law, letting a few things crash and piling up a few lawsuits is how one changes the law round here. Just like the H&S disaster we all know is waiting to happen, but do nothing about until it does.
The ideas of the Napoleonic Code are far from perfect, but it at least puts the horse in front of the cart...
Swap files eating my disk
With machines having so much RAM to play with today, you have to wonder how much scratch space is really needed on disk outside of a big database, video editor or DAW. As annoying as losing 10% of your 250MB disk to swap space was on Win3.1, there was at least a justified reason for doing so!
Then again we are also in an era where a PDF reader ranges from under a meg up to hundreds of megs. I'll take the former, thanks.
Windows 10 was promised to be the last version of Windows, even though it has a stated end of life. Might as well start making moves towards its successor. My crystal ball says an MS buyout of Canonical. Reserved partition == /tmp?
IBM have just gobbled Red Hat. MS will be after Canonical to keep up. Never say never!
Evolve or die.
There is not a gamer on the planet who wants a fixed disk for hosting software. Data dump, yes; games, no. Evolve or die. The £/GB of flash is really not that far behind spinning rust any more.
Toshiba is already going down the pan, dragged down heavily by its subsidiary Westinghouse, which never evolved beyond the 1970s either. One does not have to be a genius to see that the board are clueless about how to save their sinking ship.
The lack of willingness to try out new material does my nut in. There are thousands upon thousands of novels worthy of a big-screen makeover, but no, let's rehash something we know yet again. Harry Turtledove's Worldwar series, for example: I have said since it began that it was practically a screenplay already, one ripe for a blockbuster.
I can assure you from first-hand experience that certain rather large banks are still running NT4 on ATMs, let alone XP. OS/2 was prevalent until very recently!
But yes, the real issue is access in the first place. Prevent physical access and 99% of these attack vectors go away. The exception, and the far more plausible one, would be someone with access to other areas of a bank's network directing attacks at the ATMs.
The inside job, supported by insider info, is a far more plausible threat.
Marketing materials have echo
This isn't the first time I've seen an analytics outfit completely echo the screams of businesses unable to deploy big data tools... KNIME comes to mind, amongst a bunch of others. The sales hook is solid. But once again the real problems with organisations and mantronic processes are forgotten: segregated databases designed in isolation, different naming schemas, dissimilar but associated record keeping. "Fixing" those problems means wholesale system change on a large scale. Not going to happen, sorry; the cost-benefit doesn't stack up. Come up with tools that let me plug those disparate worlds together, and then analytics could come to life.
In practice I've found the most powerful technique is dumping the contents of various enterprise DBs into SQLite, then gluing them together for hacking in Python and NumPy. Unless you can reduce the skills requirement needed to do what I've just described, the mantronic solution is going to win every time.
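For anyone who hasn't tried the dump-and-glue approach, a minimal sketch of the idea follows. The table and column names are invented purely for illustration, standing in for two segregated source systems with different naming schemas.

```python
# Hedged sketch: two notionally separate enterprise extracts loaded into
# one SQLite database, then joined despite their differing column names.
import sqlite3

con = sqlite3.connect(":memory:")  # a file path in real use
cur = con.cursor()

# Extract from system A: accounts under one naming scheme.
cur.execute("CREATE TABLE crm_accounts (acct_id TEXT, acct_name TEXT)")
cur.executemany("INSERT INTO crm_accounts VALUES (?, ?)",
                [("A001", "Acme Ltd"), ("A002", "Globex plc")])

# Extract from system B: the same entities under a different scheme.
cur.execute("CREATE TABLE billing (customer_ref TEXT, balance REAL)")
cur.executemany("INSERT INTO billing VALUES (?, ?)",
                [("A001", 120.50), ("A002", -40.00)])

# The glue: a join the segregated source systems could never do themselves.
rows = cur.execute("""
    SELECT a.acct_name, b.balance
    FROM crm_accounts a
    JOIN billing b ON b.customer_ref = a.acct_id
    ORDER BY a.acct_id
""").fetchall()
print(rows)  # from here, hand off to NumPy/pandas for the real analysis
```

The skill barrier is exactly what it looks like: SQL plus a bit of Python, which is precisely what most mantronic processes never require of anyone.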
System Administrators ruled by Accountants
The only appeal I can see in WSL is for those users sat behind businesses with terribly restrictive IT, to the point that they can't get a VM or a blank machine to install a distro of choice. That feels like a pretty limited user base if you ask me.
Having failed to destroy Linux on reputation, MS are now adopting their classic Embrace and Assimilate strategy. Anyone with the will and need to learn Linux will absolutely prefer the real thing. Why continue to support the monster when you don't have to?
(I'll let you off if you still play games on Windows; that's all it's good for, and even then, Win7 is still my preference!)
Persistent DIMMs - Persistent Malware
Forgive my ignorance but last time I commented on persistent memory I was downvoted horrendously.
Building super-fast persistent storage onto a memory interface may be a recipe for speed, but it is also a recipe for hitherto unseen security disaster.
I'll be quite happy to let the technology evolve for 3-4 years, let the inevitable chips fall, and then, if at that point it proves reliable, start considering it. I would also hazard a guess that investment in better software and data structures would give a far better return than throwing inefficient processes at increasingly power-hungry hardware.
Still in service... in both industry and with retro gaming!
Alarmingly, I am aware of several high value commodity metering installations still running instances of Win98. Presumably their uptime is not regarded as terribly important?
In fairness to 98SE, it was arguably the most compatible OS for gaming purposes in its day. I alternated between 98SE and 2K for games and productivity. While the former was a pain to maintain, there was not a title in the land I could not get running on it. It's often the only thing that will manage early DirectX titles. Now I'll admit, not many of those are worth returning to, but try getting Interstate '76 running on anything else? Maybe one or two other standout titles.
I recall an interview with GOG commenting that those early DirectX and 3DFX titles are by far the most difficult ones to get running on a modern system. Virtualisation isn't exactly well supported in these niche environments!
Re: haters gonna hate
I can't see how anyone can defend them anymore. Planned obsolescence, slurp, mission creep, bloat... Ironically, current releases of Visual Studio are probably Microsoft's best efforts, but one cannot simply ignore the downright criminal history of their organisation. The AARD code is the most famous example, and likely the tip of the iceberg of malpractice. When MS bought LinkedIn, my details were removed immediately. GitHub will meet the same fate. Regrettable, because both can be useful tools. I encourage all to move on. The grass may not be as tall, but it is green!
Garbage in, very definitely garbage out
In my experience, collecting data about the physical world is rather more difficult than collecting data on the flows of money, or trending my film-watching preferences. To use an example: how does one collect data about the physical condition of your server racks? Are cables routed tidily? Is the structure suffering from rust? Or, using a more pertinent example, how do you quantify the condition of a hundred-year-old bridge? What about the inaccessible metal reinforcement inside the concrete structures found virtually everywhere? Unlike Roman concrete made with seashells and volcanic ash, modern rubbish does not improve with age!
Such things tend to be expressed in terms of probabilities of reaching a certain age. The objective, historically, was to undertake pre-emptive replacement. The cost challenge comes, quite rightly, from "if it's been OK for 50 years, won't it be OK for 10 more?" But how do you prove it, other than by taking a punt on it and praying you'll be retired before it comes home to roost?
Magic AI boxes and analytics teams are employed to generate outputs in multiple industries. The outputs of those processes have to be calibrated. Lo and behold, calibration exercises are usually run against those original probabilistic models, especially when there is an absence of evidence!
From an engineering standpoint, I find the whole business of AI taking over the decision-making functions a depressing distraction from the shortage of people understanding form and function. If your model does not know about a failure mode, it cannot predict or react to it. Even more depressing are those oblivious to the fact that our aged infrastructure, no matter how well made or maintained, has a finite life. Attempts to fiddle with life extensions here and there still ultimately lead to asset replacement, and in the meantime you continue to accumulate additional risk and maintenance cost trying to eke every last bit out of the original.
Simply put, it would be cheaper for all concerned to simply get on with it and put the replacement up!
Russians poised to fire intercontinental ballistic missile... into space with Sentinel-3 sat on board
Re: Launch partner?
Converted ICBMs otherwise headed for the chop shop are relatively cheap. It's probably also cheaper for Russia to hand them to Roskosmos to convert the payload and launch than to leave them to the military to decommission.
I don't recall if it was the SS-19 specifically, but there are some heart-stopping videos of Russian ICBM launch procedures and how they clear the silo. Rather than fire the main engine in the silo, a brief solid-fuelled pop puts the rocket above ground, leaving it to briefly fall back while the main motor engages.
This really is no surprise at all. ARM boards for servers have been in circulation for some time now. This is assuming it's an ARM, of course; the latest incarnations of PPC are pretty tasty too. Can we expect an iOS and OS X merger to follow? Have Apple learned from MS the folly of putting a mobile OS onto the desktop?
The twisted part of me is also thinking a Raspberry Pi OS X is probably not far behind either.
The Windows 8 debacle, and the loss of control over software in 10, meant I, like many, many others, didn't bother upgrading from 7.
We were promised Windows 10 would be the last Windows ever, and with MS growing closer to Canonical (and others) by the day, it would not at all surprise me to see a buyout on the cards as 10 enters its extended support phase.
I've already made the jump. Windows-only games that I still want don't get bought; instead the devs get a nice mail asking for a port, please. If enough follow suit, the devs will soon follow the money.
This is Apple just doing their thing, playing with the equation "Number of Units Shifted * Profit per Unit". They've got away with blue murder raising prices, and one suspects there's some overpaid academic somewhere who wanted to try out something "ludicrously priced" just to see what happens. If 33% haven't upgraded, that's still 67% who HAVE. The equation obviously still balances in their favour, or their balance sheet would be in trouble.
It's been far too long since I've seen desirable Apple hardware, apart from their laptops maybe. OS X, of course, isn't sold as a standalone; if it were licensed for a reasonable amount (like, say, Win10 Pro pricing, or even a bit higher), we'd all snap their arms off to swap. The argument that doing so would hurt Apple's hardware margins doesn't hold so much anymore, what with Mac sales generally through the floor (and yes, we still like the laptops!)
Late to the party
I started off on the Spectrum 48K, which thankfully had a keyboard better than a calculator. The manual, to this day, is still a great primer on BASIC programming. Not a great primer on basic programming, of course. I can hear the academics turning in their graves over the use of GOTO rather than functions!
That machine got lemonaded during a melee over a joystick, so we went backwards to a Sharp MZ80 for a couple of Christmases, then leapt forwards with a C64. To this day I'm grateful to my Dad for putting us through the iterations, as it's ultimately been my ticket to a job ever since! I'll be damned if I could still remember how to use the MZ80, but all the other machines I can still bodge my way through.
I recently returned to the C64, fully re-capping and heatsinking the whole system and adding the excellent Epyx Fastload Reloaded cartridge with an SD card reader. You forget how hard programming is without a proper text editor or variable names longer than two characters. The C64 Programmer's Reference Guide is hilarious: the complete opposite of modern editing practice, encouraging the use of super-condensed code, avoiding comments, minimal spacing, condensing multiple statements onto single line numbers and using near-unreadable short symbolic codes for the BASIC keywords... The kids don't know how easy they have it with Python and virtually unlimited resources!
It even encourages you to jump into assembler for maximum efficiency. Hard to find assembler being actively used now, other than for a handful of specialised operating system functions... And even then most of the op system you are reading this on was probably done in C or C++.
Bang on. I write and use programs based on a particular industrial standard that has errors and inconsistencies in its use of scientific notation. Changing the standard takes years, of course, whereas doing the necessary fix on my side is trivial. Who is responsible if the program calculates something wrong? It might be a case of garbage input. Or hardware error. Or even third-party interference. Nobody should be so hasty as to assign blame, for the devil is in the detail.
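The kind of client-side fix described above is usually a few lines of normalisation. A hedged illustration: the real standard's quirks aren't given here, so the notation variants below (Fortran-style `D` exponents, and exponents with the letter dropped, as some fixed-width formats do) are invented for the sake of example.

```python
# Hedged sketch: normalise inconsistent scientific-notation strings
# before handing them to float().
import re

def parse_sci(s: str) -> float:
    """Parse '1.5E3', '1.5D+03' (Fortran-style double), or '1.5+03'
    (exponent with the letter omitted entirely)."""
    s = s.strip().upper().replace("D", "E")
    # Re-insert a missing 'E' before a trailing signed exponent, e.g. '1.5+03'.
    s = re.sub(r"(?<=[0-9.])([+-]\d+)$", r"E\1", s)
    return float(s)

for raw in ["1.5E3", "1.5D+03", "1.5+03", "-2.0e-2"]:
    print(raw, "->", parse_sci(raw))
```

Trivial to write, yes, but note it also illustrates the liability point: every such shim is a silent reinterpretation of someone else's data.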
Re: Potentially good phone, ruined by a dreadful company
I had this same issue. The Swift 2 doesn't charge from flat from a USB socket; it needs a mains-to-USB adapter with a bit more kick. Agree though, the support elements are (were?) dreadful. A mate of mine sent a phone back to get the screen replaced; the only record of the RMA was held on Twitter, ffs!
However, at 50 quid a phone I am largely of the opinion they were expendable devices, not even worth the hassle of repairing.
I'd argue the Swift 2 was probably one of the best off-the-shelf phones around in its day. The (mostly) no-crapware policy was a welcome change from any of the alternatives. The price point was excellent too.
Honestly can't think of any plus points for the otherwise overkill $800 handsets that have become dominant. I'd take an old Nokia 6150 or similar in preference to a phone more expensive than my PC!
So, when you spill coffee on your laptop trackpad; the laptop has to be sent off to be fixed.
When you inevitably get a dodgy keyboard from eating sandwiches at your desk (due to the lack of time to take a lunch hour), the laptop has to be sent off, rather than fixed on site.
When you need the OS image re-installing because of issue x? One sends off the laptop?
When you need a software install that isn't something in the pre-approved catalogue... You send off the laptop? Or just demand administrative privileges and do it yourself...?
I think I might retrain as a courier. It's looking like there will be an awful lot of work in that vein in the near future. IT support was always about more than just remote database administration, was it not?
Also, where you say cloud, I say mainframe client. The 1960s default architecture has simply been rebranded. Such architectures have their place, but they are by no means a panacea, and especially not today's desktop "thin clients" suffering from dependency hell: obsolete Java and Oracle clients, DLLs, IE6/7/8/9/10/11, Node.js... I could list others. Somehow, those wonderful text-only mainframe terminals look rather more appealing to maintain than a Windows N terminal onto a "cloud" remote service.
I still maintain Fortran IV code in production, and am all the more thankful for having control. God help anyone coming after me who has to learn what it does. Amongst other things it emulates the input of a punch card, using a text file as a proxy. Good luck finding any offshore provider able to provide any kind of support for that!
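For the curious, the general idea of the punch-card proxy is easy to sketch. The Fortran itself isn't reproduced here; this is a hedged Python illustration of the concept, where each line of a text file stands in for one 80-column card and values are read from fixed column ranges. The field layout is invented for illustration.

```python
# Hedged sketch: treat each line of a text file as an 80-column punch
# card, reading fields out of fixed column positions (the layout here
# is hypothetical, not the real production format).

def read_card(line: str) -> dict:
    card = line.ljust(80)            # pad short lines to a full card
    return {
        "record_type": card[0:2].strip(),    # cols 1-2
        "part_no":     card[2:10].strip(),   # cols 3-10
        "quantity":    int(card[10:15]),     # cols 11-15, right-justified
        "description": card[15:40].strip(),  # cols 16-40
    }

deck = [
    "01WIDGET01   25Left-handed widget",
    "01SPROCKET  100Standard sprocket",
]
for card in deck:
    print(read_card(card))
```

The point being: nothing about the format is hard, but without the column map in someone's head (or in comments the original author never wrote), every field boundary has to be reverse-engineered.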
Half the problem in my experience is capturing something worth analysing in the first place. Defect data is rather useless if one category field captures everything from hand-wash bottles to fan control system failures!
The second problem is the enormous legacy of non-digitised records: drawings, microfilm, Mk. 1 Contents-of-Brainssss. Sort those elements out and big data can start to pay off.
In the meantime though it's just a marketing gimmick to employ people while not solving the underlying problems...
Every car plugged in adds demand. That demand wasn't there before and is therefore not being served by renewables, which by definition should have been running already if possible. This means the only leccy going into your plug-in is either gas- or coal-derived, from the remaining marginal generator, the only thing left to fuel your car.
Add in transmission and distribution losses, the latter of which are going to skyrocket due to distribution networks not being sized to service large volumes of leccy cars, and you have a problem needing a lot of investment to solve.
Energy policy is an absolute farce and the idea that the market will deliver is tosh. The market is geared for old school big, predictable generators and dull demand profiles. Severe change needed at all levels to avoid blackouts. Maybe we should be getting used to the idea of sacrificing availability...
Carbon copy of engineering
We have the same problems in engineering. Girls consistently do better early on in maths, but cultural limitations do their damnedest to ensure they rarely take it up at A-level or beyond. I cannot help but think the current generation of headless games consoles and tablet computing does even less for self-education than the Mega Drive era did. Microsoft and others are in for a very rude awakening when the critical mass of skills in circulation falls to a point where it's impossible to climb the learning curve all over again. I would argue we are starting to see that now, with black-box solutions rather than understanding being the norm.
Re: Oh noes
Nominally I would agree; however, I ran a migration from legacy z/OS keyboard-driven dumb terminals onto NT4 as late as 2005. Besides everything that is wrong with that statement, a significant number of older employees had never picked up a mouse before, and that played a major part in persuading management that it was worth including Solitaire in the build to teach basic mouse skills.
If SPARC hardware wasn't hopelessly overpriced it would sell better. I appreciate the quality of the product, but against a backdrop of alternatives that are "good enough", often the only reasons for the architecture to persist are legacy apps deemed too difficult to move, or specific cases where you want alternative architectures in the mix. For everything else, why wouldn't you dump it onto x86-64, or maybe, at a push, ARM?
I still have a tiny handful of apps on UltraSPARC II. After 20 years of service it just isn't cost-effective to keep them there anymore; they have literally just a few weeks left. Even if that means replacing an x86 server every 5 years, which is probably an underestimate of its lifetime.
Hell, I still have a Socket 604 Xeon that works quite happily, albeit a bit power-hungry compared to a Raspberry Pi for the same kinds of mucking-about workloads that the two see!
I surely can't be the only one who thinks that the cryptocurrency fad is a colossal waste of fossil fuel... I mean, it's marginal demand that otherwise wouldn't have been there, and the marginal generators of choice are almost always coal or gas! Surely a more important stat would be how many fps you get in Counter-Strike or Quake 3?
That's AV ruined, then.
Ahh, persistent memory: another government-sponsored feature, perhaps? Malware can sit in RAM surviving undetected through power cycles, or even move from one motherboard to another, independent of the operating system. Your boot CD or recovery tool of choice will no doubt be oblivious to well-thought-out malware disguising itself on boot. Don't mention the possibilities for interfering with common software installers. Yet another feature we don't need, with enormous risks not properly thought through.
He has a point in believing coal generation is their only major source of synchronous inertia. Coal and nukes are the traditional European sources, both of which are generally being scaled back, despite horrors like the Hinkley Point project. Oz, to the best of my knowledge, hasn't got a lot of gas or nuke capacity, leaving little alternative.
Gas turbines offer some inertia but diddly squat compared with a coal guzzler like Ratcliffe or Drax.
Small scale inverters on home solar are mostly programmed to match the local frequency. If that happens to fall outside of acceptable limits, they are strangely enough programmed to stop generating. That's a sensible thing for one unit to do - but the 11+GW installed base of solar in the UK potentially tripping off at the same time on a frequency issue? You need a whole LOT of Dinorwigs to compensate for that. As in, about 7 more of them. I don't see that being planned any time soon.
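The Dinorwig arithmetic is easy enough to check on the back of an envelope. Both figures below are rough public numbers (roughly 11GW of installed UK solar, and Dinorwig's roughly 1.7GW pumped-storage output), so treat the result as an order-of-magnitude sanity check, nothing more.

```python
# Hedged back-of-envelope check: how many Dinorwig-sized stations would
# it take to cover the installed solar base tripping off at once?
import math

solar_gw = 11.0     # approx installed UK solar capacity (public figure)
dinorwig_gw = 1.7   # approx Dinorwig output (about 1,728 MW)

stations = math.ceil(solar_gw / dinorwig_gw)
print(f"about {stations} Dinorwig-sized stations needed")
```

And that ignores how long each can actually sustain output before the upper reservoir runs dry, which only makes the picture worse.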
Now, add two and two together and you'll realise that some badly timed generator failures while the solar farms are busy could lead to a cascading fault scenario.
Energy plan? Our market-driven approach couldn't be more clueless if it tried; too interested in satisfying pensioners' share payouts. Such a shame we lost David MacKay, who attempted to bring rational debate to energy.
Bring back the CEGB. All is forgiven.
I attended another IBM event last year at Hursley where they demoed Watson to answer questions about the history of the building.
Even given such a limited set of source data, consisting mostly of one guidebook, rudimentary questions like "When was the building completed?" simply didn't work.
Now, despite this shining recommendation I am aware of other Watson projects being trialled to look at, for example, technical standards a whole lot more diverse and poorly referenced than a guidebook with contents and index pages. Can't wait to see how dreadful the results are.
Incidentally, the same demo session also included IBM attempting to sell back to us ideas that not only did we tell them about, but already have in production.
Big Blue has some fantastic technical experts, but in my experience those who can are being poached elsewhere, leaving a shell of a company and a lot of managers who talk the talk but no longer have the techie expertise.
Ancient Xeons - still useful and cheap
I cobbled together some upgrades to a Dell Precision 530 workstation that was destined for the scrapheap: the final revision of the motherboard, from 2001 (!), plus a pair of 3GHz Socket 604 Xeons. The HDDs are a bunch of 15k RPM SCSI monsters, and the video was upgraded to the best AGP Radeon card I could find. This was all topped off with some riser cards and 4GB of RAMBUS RIMMs.
The parts for this lot came from ebay and scavenging, and cost less than £40.
Despite the age, it still runs Win7 32-bit every bit as snappily as my i7-6700K. Obviously you can forget games from this decade, but older stuff still behaves, and it's more than good enough for internet/office-type tasks. Yes, it's a bit power-hungry, and yes, a bit noisy, but that's beside the point.
So, against that backdrop, why on earth would anyone without an interest in cutting-edge software consider upgrading? The only reason is hardware failure. I should also point out that I have replaced friends' and family's PCs with either Macs or custom-built Win7 towers, in preference to letting anyone suffer the horrors of zero control in Win10.
I also cite the world's change from purchased, licensed software to subscriptions as a major factor in dropping PC support. Adobe, MS, even Autodesk have all gone down this road, and none of us want it. I mean, why bother upgrading when you can get a permanent copy of Office 2010 that does everything you need and then some for not a lot? A damn sight nicer than a permanent subscription just to use MY damn computer.