For the last couple of years, I’ve been yammering about how enterprise analytics (or Big Data, or Predictive Analytics) is going to be the next big thing in business and thus enterprise computing. The major vendors, including IBM, Oracle, HP, and Microsoft, are on board along with pioneers like SAS and Teradata. Everyone is busy …
CPU yes, but what about storage
Although the author asserts that running on existing hardware will lower the cost of big-data analytics, a key point of "Big Data" is not CPU load; it is that you have lots of low-value data to work with. Platform doesn't address that part of the story: they may have better scheduling than Apache Hadoop's out-of-the-box schedulers, but their storage story is the same, run HDFS for location-aware storage.
No doubt IBM's story here will become that of IBM's grid story: use GPFS. But that increases the cost of storage in exchange for location independence, which limits the amount of data you can retain.
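To make the locality point concrete: "location-aware storage" matters because the scheduler can place each task on a node that already holds a replica of the block it reads, avoiding network transfer. A minimal sketch of that idea (hypothetical data structures, not the actual Hadoop scheduler API):

```python
# Sketch of locality-aware task scheduling: prefer a node that already
# holds a replica of the data block, fall back to any free node.
# Data structures here are illustrative, not Hadoop's.

def schedule(tasks, block_locations, free_nodes):
    """tasks: list of block ids to process.
    block_locations: block id -> set of nodes holding a replica.
    free_nodes: nodes with spare capacity.
    Returns a dict mapping block id -> assigned node."""
    assignments = {}
    available = set(free_nodes)
    for block in tasks:
        # Intersect replica holders with free nodes to find data-local slots.
        local = block_locations.get(block, set()) & available
        node = local.pop() if local else (available.pop() if available else None)
        if node is None:
            break  # no capacity left; remaining tasks wait
        assignments[block] = node
        available.discard(node)
    return assignments

locations = {"b1": {"n1", "n2"}, "b2": {"n3"}}
print(schedule(["b1", "b2"], locations, ["n1", "n3"]))
# both tasks land data-local: b1 on n1, b2 on n3
```

A location-independent filesystem like GPFS removes the need for this matching, but you pay for it in storage cost, which is exactly the trade-off above.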
The problem is...
...this article sounds as though it's written entirely from information gleaned from somewhere like Gartner. Having once worked for a company which rates very highly in several Gartner categories despite the actual quality of its offering, I can assure you that trends and predictions based on that sort of information are utterly useless.
About the best you can do is have experienced teams working on your projects. They will work around whatever tools corporate has imposed on them and make it work. Inexperienced teams won't be able to.
Microsoft is a major vendor?
"Microsoft is a major vendor"?
Isn't that a bit of a quaint statement now?