Yet another exaggerated power-saving claim...
"Swap 1,000 PCs for a bunch of servers and the same number of thin clients, and you'll divide your electricity consumption by ten."
Nonsense. A thin client still needs a screen, so for a tenfold reduction to be possible, the screen and everything else left on the desk would have to account for no more than 10% of the original setup's consumption. A typical corporate PC does not have to be a power-hungry monster with the latest 3D graphics card; for many users it is a laptop. A reasonably powerful desktop averages less than 120W, and laptops or desktops built with low-power techniques can get down to around 50W. The thin client itself still needs some processing power of its own, and to offset the loss of local computing you need more processing and storage at the centre (which needs air-conditioning) plus higher-speed, more power-hungry networks. Then there is the small matter that, in winter at least, the power consumed by local office computing is partly offset by a reduction in heating costs.
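Here's a rough back-of-envelope sketch, in Python, of how the per-seat numbers might stack up. The desktop figure comes from the text; the screen, thin-client, server, cooling and network figures are illustrative assumptions, not measurements:

PC_DESKTOP_W = 120     # typical corporate desktop (figure from the text)
SCREEN_W = 30          # assumed: a typical office LCD monitor
THIN_CLIENT_W = 15     # assumed: the thin-client terminal itself
SERVER_SHARE_W = 20    # assumed: per-seat share of the central servers
COOLING_SHARE_W = 10   # assumed: per-seat share of data-centre cooling
NETWORK_SHARE_W = 5    # assumed: per-seat share of faster networking

pc_seat = PC_DESKTOP_W + SCREEN_W
thin_seat = (THIN_CLIENT_W + SCREEN_W + SERVER_SHARE_W
             + COOLING_SHARE_W + NETWORK_SHARE_W)

print(f"PC setup:          {pc_seat} W per seat")    # 150 W
print(f"Thin-client setup: {thin_seat} W per seat")  # 80 W
print(f"Reduction:         {100 * (1 - thin_seat / pc_seat):.0f}%")  # ~47%

Even if you quibble with the individual numbers, you would need heroic assumptions to get the total down to a tenth of the original.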
No - these claims of 90% reductions come from the same sort of unreliable sources that claim a mobile phone charger draws 4-5W just from being plugged in (witness all the ads plastered over London). Take a modern phone charger and measure it: the real figure is less than half a watt.
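For scale, the same quick arithmetic applied to the charger claim, assuming it stays plugged in all year:

HOURS_PER_YEAR = 24 * 365  # 8,760 hours

claimed_kwh = 4.5 * HOURS_PER_YEAR / 1000   # the 4-5W claim, taken at 4.5W
measured_kwh = 0.5 * HOURS_PER_YEAR / 1000  # the measured ~0.5W figure

print(f"Claimed:  {claimed_kwh:.0f} kWh/year")   # about 39 kWh
print(f"Measured: {measured_kwh:.1f} kWh/year")  # about 4.4 kWh

An order-of-magnitude exaggeration, much like the thin-client claim itself.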
The answer to all this is undoubtedly to produce more energy-efficient appliances, and there is plenty of scope to do that on personal computers of whatever sort. Just don't put gaming cards and top-end processors in office PCs.
There are valid reasons for moving much computing resource centrally, but a reduced electricity bill isn't the main one.