Microsoft plans on delivering supercomputing power to a broader audience of scientists via its cloud computing and server technologies. On Tuesday, the company announced its new Technical Computing group, saying it will let scientists focus on research without having to build or program complicated applications or server systems …
"New" supercomputer group?
When in doubt, reorganise. After all, it worked for DEC, didn't it?
What were MS's HPC folk up to back in 2006 (or was it 2003? The answer is in the article, for the Sun readers amongst us)?
"Microsoft has just chosen the UK for the worldwide launch of Windows Compute Cluster Server 2003, its HPC (High Performance Computing) offering.
This is because high performance and technical computing is a growth market in the UK and a lot of the basic work on it was done at places like Professor Simon Cox's Microsoft HPC Institute at Southampton University."
It was a big hit in the HPC marketplace, was it not?
What, even less successful than Vista? How come? Just because Windows hasn't a clue about high-performance, low-latency software (preferably with SMP or multi-node apps) doesn't mean it can't produce ever such pretty charts, you know. As long as you've not used fixed-point (scaled) integers for performance, that is. Excel rather struggles with those.
'Fluff' is the key word here
The website referenced in the article contained no substantive information, just a few fluffy stories ('eradicate malaria!, simplify rocket science! more efficiently screw policyholders!'), and an unsuccessful attempt to install Silverlight. The 'please visit www.modelingtheworld.com for your ideas and feedback' yielded a black page which again tried unsuccessfully to install Silverlight.
The technical computing boffins *I* work with use Linux -- coz that's where the affordable supercomputing power is right now. To a person, they're also turned off by marketing hype, so Microsoft is unlikely to impress any of them with glossy websites that require unsupported plugins.
How sweet, MS have 'goals' and 'aims' to solve problems in the scientific community. It doesn't mean they're going to achieve them. I think MS doesn't have the first clue what is actually required, but they are talking it up as much as they can.
Meanwhile, the real solutions providers will continue to serve the scientific community in an open and engaged manner. For example, IBM has been trying to solve exactly these problems through its PERCS project, which has been running for years with the aim of delivering this year.
MS is desperate to move out of their desktop/small server space, but this ain't it.
Move along MS, nothing to see.
Windows HPC has only five machines in the Top500
And of those five, two run Linux (#73 at NCSA also runs RHEL4, #106 at UMEA University runs CentOS). Those are more properly "Mixed" or "WINO" (Windows In Name Only).
So really, only 3 Windows machines in all of the top 500 supercomputers on the planet - less than one percent. I wonder if any of the others have upgraded to Linux yet? Must write some emails and ask. The other three are #94 HWW/Universitaet Stuttgart, #19 Shanghai Supercomputer Center and #74 University of Southampton.
They had better order up more boffin fluffers. At this rate Windows is going to be lumped in with the other niche HPC solutions. Perhaps it should have been already.
3? Lower, lower!
Officially Shanghai's massive system is a Windows machine, because its benchmarks were run under Windows, and the results were good enough at the time to get it into the top 10.
However the majority of the time it runs Linux because that's where the real apps and the real users are.
So, not 3, but 2, or fewer!
You ARE having a laugh, no?
Even if MS COULD write something that wasn't stupidly resource-hungry, was reliable (snigger), didn't crash (fnaaar), and worked (yeah, right), are they forgetting that the market they are aiming at (research, education) generally doesn't have stacks of money to spend on such stuff? Why would a research programme spend gobs of money on MS software that won't work anyway, will require twice the resources of other solutions, and uses proprietary and non-portable techniques, when it can use something like Linux (for example) for free (which also happens to already have a strong clustering and supercomputing base)? And get portability thrown in, which is important in the non-commercial world because then you can share stuff with other researchers and swap ideas.
MS just don't "get it", do they?
Microsoft re-launches HPC
The only area (that I can think of) where a Windows HPC cluster makes sense is mechanical CAD, where the engineer starts on a Windows box with a big graphics card and then runs out of power. He has no interest in learning a new OS (at any level, such as Linux), as he is an engineer, so an HPC cluster makes sense for him. A better solution would be a Windows scalable SMP, but those don't exist YET!
to "automate and simplify writing software through parallel processing"
I look forward to their time-machine program, so I can go back 20 years (from now, not from whenever MS builds a time machine) and wait expectantly for MS to innovate the things I was already playing with then.
Top 500: 74th place So'ton is *Microsoft* Institute for HPC
"only 3 Windows machines in all of the top 500 supercomputers on the planet "
Thank you, good to have it confirmed. One thing not mentioned re the Southampton machine in 74th place: it's a paid-for place (as doubtless are a handful of the 500 in total).
From El Reg's 2006 MSHPC article: "Professor Simon Cox's Microsoft HPC Institute at Southampton University.". Spot the significant word there?
See also http://www.soton.ac.uk/ses/research/mshpci/index.html "Microsoft Institute for High Performance Computing". "Staff from the Southampton Institute for High Performance Computing demonstrated their service-oriented Tablet PC-based photonic crystal design system on the Microsoft stand at Supercomputing 2005 in Seattle." Wtf? "Service-oriented"? "Tablet-based"? Nothing notable since the (MS-funded?) opening publicity at SC2005?
What had the academic world come to in Bliar's Britain? Education education education? No, dosh dosh dosh.
Begone, forever, Blair and all your pseudo-Tory (and real Tory) acolytes.
As a user of baby HPC ...
.. I honestly can't imagine why I would want to use Windows for this. People who need HPC are generally happy with Unix-like environments. I'd rather save the license money and buy more silicon than stretch a wasteful point-and-drool 'familiar' UI around a real machine.
Windows on HPC is like faulty air conditioning in a racing car.
MS Institutes for HPC
As mentioned by Mikel at 08:06 there are three MS sites in the HPC top 500: Southampton, Stuttgart and Shanghai.
There are ten or so MS-sponsored MS Institutes for HPC around the world. Southampton is one, and Stuttgart and Shanghai are among the others, according to the MS HPC Institute programme's blog (one entry since 2005?): http://msinst-home.spaces.live.com/
So the other seven or so MS-funded HPC Institutes don't even make the top 500. Better spend a truckload more money, Bill, or your irrelevance as regards HPC will become even more obvious than it already is.
Lead, follow, or get out of the way, Bill.
Purchasing decisions made by techies for techies?
Would it be fair to say that one reason MS are largely irrelevant in this sector is that purchasing decisions are still largely made by techies for techies based on fitness for purpose, and that pointy-headed bosses, MS-dependent consultant/reseller shills, and other such scum generally don't have the same kind of influence over the procurement process as they have been allowed to obtain in the rest of the IT world? Is there a lesson to be learned here?
Comments have missed the point
Microsoft are suggesting that they can apply cloud techniques to HPC problems, which makes it new and different.
It also illustrates that they have absolutely no idea about how large HPC shops work.
Not only is the number of processors important (which cloud can address), but I/O performance and homogeneous data access are also required for most large problems. This requires significant I/O power and localization of the data to the processors. Cloud at this time cannot provide this, unless the dataset can be compartmentalized and shipped around the cloud. Traditional decomposition techniques also require the processors handling each part of the problem to exchange data very rapidly (we're talking microsecond latencies here) with the processors handling the adjacent cells, which is likewise a problem if the compute service is geographically distributed.
About the only way you can achieve this using cloud techniques is the way that SETI and the other collaborative projects work by breaking the work down into discrete chunks, which is not suited to large problems like climate, nuclear blast or materials modeling.
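The distinction the comment draws can be sketched in a few lines of plain Python (an illustrative toy, not any real SETI or climate code): a SETI-style workload splits into chunks that are completely independent, so they can be farmed out anywhere; a domain-decomposed stencil code needs its neighbours' values at every step, which is exactly where inter-node latency bites.

```python
# Embarrassingly parallel (SETI-style): each chunk is independent, so each
# call to score_chunk could run on any node, in any order, and be merged later.
def score_chunk(chunk):
    return sum(x * x for x in chunk)  # stand-in for the real per-chunk work

data = list(range(1000))
chunks = [data[i:i + 100] for i in range(0, len(data), 100)]
partials = [score_chunk(c) for c in chunks]  # trivially farm-out-able
total = sum(partials)

# Tightly coupled (domain decomposition): every update of cell i needs
# cells i-1 and i+1 from the *previous* step, so nodes holding adjacent
# sub-domains must swap boundary ("halo") values on every single iteration.
def jacobi_step(u):
    return [u[0]] + [(u[i - 1] + u[i + 1]) / 2.0
                     for i in range(1, len(u) - 1)] + [u[-1]]

u = [0.0] * 50 + [1.0] + [0.0] * 50  # a single spike in the middle
for _ in range(100):
    u = jacobi_step(u)  # 100 rounds of halo exchange on a real cluster
```

The first pattern tolerates seconds of latency between nodes; the second stalls every iteration waiting for neighbour data, which is why it needs an interconnect, not a cloud.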
In theory there is no reason why Windows couldn't handle this in the future with the correct tuning, but you have to ask: why would anyone bother? Unless you're Microsoft, and can't bear there being a market you are unable to dominate!
...that sounds ugly and patent-encumbered.
HPC & Microsoft
Racehorse, meet milkfloat
Memo to the heathens
As the world dumbs down, the intersection of hyperbole and scientific reality will grow to encompass the entire domain, with only a sliver of each remaining, like the vanishing silver linings of the oncoming dark storm cloud that is MS HPC. Repent or perish! One way or another, I'll own your asses.