This is an interesting discussion, ranging from the practical to the esoteric (quantum computing). Electricity use is certainly a big issue, even for smaller facilities. In the UK at least, the cost of powering and cooling a machine over a four-year service life can now exceed its capital cost: electricity costs nearly £1 per watt per year, or about £1.50 per watt of equipment per year in a space cooled by conventional air conditioning. Users should be factoring this into the cost of every purchase.
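As a rough sketch of that arithmetic (figures from above; the 300 W server is just an illustrative workload, not from the discussion):

```python
# Rough cost model from the figures above. Assumptions: about £1.50 per
# watt of equipment per year in an air-conditioned room, four-year life.
COST_PER_WATT_PER_YEAR = 1.50  # GBP, including cooling
LIFETIME_YEARS = 4

def lifetime_power_cost(watts):
    """Electricity plus cooling cost over the machine's service life, in GBP."""
    return watts * COST_PER_WATT_PER_YEAR * LIFETIME_YEARS

# An illustrative 300 W 1U server would cost £1,800 to power and cool
# over four years, which is comparable to the price of a commodity box.
print(lifetime_power_cost(300))  # -> 1800.0
```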
GPUs, accelerators such as those from ClearSpeed, and novel architectures such as Sun's CoolThreads servers all have a contribution to make in particular problem areas.
For general-purpose computing there are simple savings to be made. For example, AMD Opteron "HE" chips run at 68 watts rather than 95 watts, and a second power supply in a 1U server adds around 20 watts to electricity consumption. On subsystems, 2.5" disks use less power than 3.5" disks, and in general portable technology uses far less power than desktop or server technology.
On the desktop, the most efficient "PC" is actually the Mac Mini, which uses portable technology to draw only 37 watts when working hard, 22 watts when idle, and 2 watts when hibernating. A typical PC uses more than double these figures; the obvious step, whatever the hardware, is to make sure PCs hibernate when idle.
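A quick sketch shows why hibernation matters. The wattages are the Mac Mini figures above; the duty cycle (40 working hours a week) and the £1 per watt-year rate (no air conditioning on the desktop) are my assumptions:

```python
# Annual electricity cost at the £1 per watt per year rate quoted above,
# i.e. a constant 1 W load costs £1/year. Duty cycle is an assumption:
# 40 hours a week active, the rest either idle or hibernating.
HOURS_PER_YEAR = 24 * 365          # 8760
ACTIVE_HOURS = 40 * 52             # 2080

def annual_cost(active_w, rest_w):
    """Annual cost in GBP for a machine drawing active_w during work
    hours and rest_w the remainder of the year."""
    rest_hours = HOURS_PER_YEAR - ACTIVE_HOURS
    avg_watts = (active_w * ACTIVE_HOURS + rest_w * rest_hours) / HOURS_PER_YEAR
    return avg_watts * 1.0         # £1 per watt-year

print(annual_cost(37, 22))  # Mac Mini left idle out of hours
print(annual_cost(37, 2))   # Mac Mini hibernating out of hours
```

Hibernating rather than idling cuts the annual cost by more than half, and a typical PC at double the wattages doubles every figure again.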
Focusing on total lifetime cost, including power, rather than on capital cost alone could bring a major change to the market.