Re: Avoid the commodity
There is another alternative to building bigger and bigger data centres.
Rather than look at the power footprint of the hardware, why not start looking at the power footprint of the software?
Looking at what people do on systems nowadays, how much more productive are they with, say, office productivity suites today compared with 15 or 20 years ago, when systems had a fraction of the computing power and consumed far less electricity? You only have to look at a 15-year-old PC to see that its power supply could deliver only around 100W; look at a modern PC and you will find that 300-500W power supplies are now the norm.
I know there are new applications that genuinely need a high footprint (anything to do with high-quality media is a prime example), but for many tasks, both commercial and personal, modern software is big, bloated and power hungry.
The power economies available from ARM and Intel's Haswell show that considerable savings are possible, but these have largely been soaked up by software with ever-higher requirements. Reducing the memory footprint and CPU cycles needed to run the software means that each system can do more work within the same power budget.
I'm not saying that all workloads can have their power significantly reduced (Big Data and HPC workloads will always be memory- and/or processor-intensive), but much of VDI, simple data processing and web serving is hugely inefficient because of the way those workloads have evolved and the tools used to write them.
So my view is: dump the RAD tools and languages that require tens of megabytes to run "Hello, World.", and move back to developing light-weight applications on stripped-down OSs, coded by skilled programmers tasked with writing efficient code, then run more work on systems within the existing power footprint.
The cost balance will move from quick to develop but expensive to run, to expensive to develop but cheaper to run, and that equation will shift further in efficiency's favour as power gets more expensive. It will have to happen eventually, once computers reach the limit of what can be achieved within the available power budget, so why not start now, before the crisis hits us?