So what's new about putting huge data centres next to supplies of cheap energy, as other energy-intensive industries like aluminium smelting have long done? Aren't Google, Microsoft and others already placing data centres next to hydro systems? Aren't those same companies looking at geothermal sites for large data centres? The problem with relying on wind power is that it's so unreliable: any giant data centre will need its own giant generator for windless days, unless it draws power over the grid, and that means transmission losses to and from remote locations, which I think is where we came in.
As for reduced processor requirements at the desktop, think again. Anybody using Flickr with a few thousand photos and a couple of hundred sets and collections will find their CPU running flat out at times when organising photos. Then there's all that high-resolution video stream processing.
That's before considering activities like gaming, video processing, image processing and other graphically intensive tasks, all of which benefit from plenty of local processing power and local media distribution. On top of that, the increased network bandwidth comes at a cost, with all the extra network kit required.
The answer on the local processing side is smart technology that adapts its power usage to the processing demand; there is already a lot of progress in this field.
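To make the idea concrete, here is a rough sketch of utilisation-driven frequency scaling, the sort of thing the Linux "ondemand" governor does. The frequency steps, thresholds and function names below are all made up for illustration; real governors are more sophisticated:

```python
# Toy sketch of demand-proportional clock scaling: raise the CPU clock
# when utilisation is high, lower it when the machine is mostly idle,
# so power draw tracks processing demand. All values are hypothetical.

FREQ_STEPS_MHZ = [800, 1600, 2400, 3200]  # made-up P-states

def pick_frequency(utilisation: float, current_mhz: int) -> int:
    """Choose a clock speed from recent CPU utilisation (0.0 to 1.0)."""
    if utilisation > 0.80:
        # Demand spike: jump straight to the top step.
        return FREQ_STEPS_MHZ[-1]
    if utilisation < 0.30:
        # Light load: drop one step to save power.
        idx = FREQ_STEPS_MHZ.index(current_mhz)
        return FREQ_STEPS_MHZ[max(0, idx - 1)]
    return current_mhz  # moderate load: hold steady

# A burst of work ramps the clock up; idling walks it back down.
freq = 800
for load in [0.9, 0.9, 0.1, 0.1, 0.1]:
    freq = pick_frequency(load, freq)
```

The asymmetry (jump up fast, step down slowly) is the key design choice: it keeps the machine responsive to bursts while still shedding power when demand falls away.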
So by all means put the server farms next to the generating capacity, but as a means of reducing local power requirements? I think not.