Canonical is accelerating Ubuntu's push into the cloud, delivering an integrated stack of cloud platforms ready for download. Canonical has revealed that it's working with open-source project Hadoop and NoSQL database providers to deepen the level of integration between these big-data technologies and the Linux distro's next …
A note from the UK
May I be the first British person to point out that "fluff" is generally a polite way of saying a four-letter word starting with F and rhyming with "duck".
You're probably the only British person to use "fluff" in this way, too. The rest of us just cut to the chase; no fluffing around.
Speaking as a Brit, "fluff" means a light, downy material. Possibly something like lint or, gosh, a cloud. Or even something of little significance.
Where you get the notion that "fluff" is a euphemism for coitus in British English eludes me; I have never heard it used in that context at all.
Google's definition of "fluffed":
2. Fail to perform or accomplish (something) successfully or well (used esp. in a sporting or acting context): "the extra fluffed his only line"
Another note from the UK
Well, sorta. I was thinking more fluffed = blown, which is roughly how I imagine the author intended it, except that he's fluffing and blowing in preparation for the big scene whereas I've blown it.
Commercial Hadoop support. Brave.
I know it sounds easy to support "another" OSS project, but Canonical are going to find Hadoop different. Facebook, Yahoo! and others worry about any patch that could threaten their data, and to cut a release someone needs to stress-test it on a real (not a virtual) datacentre with 1,500+ servers. The problems we're currently worrying about are things like disk balancing on 12-HDD servers and better sharing of idle cluster time with other applications (through better reporting): stuff you only discover is a problem on big clusters.
This is why the current recommended Hadoop config is always a specific Sun JVM that everyone is happy with, plus a specific RHEL/CentOS OS image that has also been tested by the big players at petabyte scale.
If Canonical end up producing yet another Hadoop branch, that's just going to cause trouble. But without the ability to get their patches into the core (which they currently lack, not having any committers), that's exactly what's going to happen.
What would be good here is that they could tweak Ubuntu to work better with Hadoop, such as:
* allowing us to turn IPv6 off
* not defaulting to an /etc/hosts entry that maps the local hostname to 127.0.0.1. This may make sense for laptops, but not for machines in a datacentre.
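For illustration, the two tweaks above might look roughly like this on a stock Ubuntu box. This is a sketch only: the exact sysctl keys, and whether the default hosts entry uses 127.0.0.1 or 127.0.1.1, vary by release, so verify against your own system first.

```shell
# Sketch, not a tested recipe -- check keys/paths on your release.

# 1. Turn IPv6 off via sysctl ('sysctl -w' applies immediately;
#    add the same lines to /etc/sysctl.conf to persist across boots):
sudo sysctl -w net.ipv6.conf.all.disable_ipv6=1
sudo sysctl -w net.ipv6.conf.default.disable_ipv6=1

# 2. Stop the local hostname resolving to loopback, so Hadoop
#    daemons bind to the machine's real address.
#    Typical default in /etc/hosts:
#       127.0.0.1   localhost
#       127.0.0.1   node01.example.com node01   <- problem in a cluster
#    Fix: remove the second line, or map the hostname to the
#    machine's datacentre address instead (example IP below):
#       192.0.2.10  node01.example.com node01
```

The hostnames and the 192.0.2.10 address are placeholders; the point is simply that cluster daemons should resolve the local hostname to a routable interface, not to loopback.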
"Canonical's end goal seems to be single-click deployment for a cloud from the Ubuntu command line"