When we wrote about the possibilities of green software back in July, the response from Register Developer readers was mainly positive. Thanks for going easy on us. One reader was actually inspired to kick-start his own green software initiative while the issue of green software drew significant comment outside of Reg Dev's …
things aren't what they seem
Any electrical engineering student can debunk this one. In our national power grid (the electrical generation/transmission/consumption system) the primary consumer of electricity is the transmission line.
Google has recognized this (they do employ lots of electrical engineers) and is building their server farms as close to the generating plants as possible. They seem to prefer using hydroelectric power plants as opposed to coal fired plants.
Re: google avoiding coal fired plants
Google is betting big on future carbon regulations. They're investing in solar power and other renewables. Coal might be cheap now, but if you're building a huge datacenter, you don't want to be exposed to the significant risk that the power will suddenly become much more expensive (or disappear altogether!) because of carbon taxes / cap-and-trade schemes. They've famously invested in Ausra, but other solar firms haven't escaped their notice for datacenter supply; their rooftop installation was contracted from a different company, which isn't interested in the utility-scale stuff.
Trying to reduce the carbon footprint of Google is like rearranging the deck chairs on the Titanic: the datacentres are too big and their power consumption too large for renewables, except on a truly awesome scale, to make any odds. Add up the power consumption of the datacentre, the cooling, the network services and the power needed to manufacture thousands of servers and disks, and you really have a huge environmental problem. The solution is rather more basic: we must do more with less. Our power consumption must be reduced, and letting companies like Google go on building datacentres of essentially infinite size is just storing up massive future problems.
"In our national power grid ... the primary consumer of electricity is the transmission line."
Evidence? It seems rather unlikely that power companies would burn half their product on the wires when they could cut those losses by 75% simply by using stronger pylons and thicker wires.
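The reply's skepticism can be grounded in basic circuit theory. A minimal sketch, assuming a simple resistive line model (symbols here are illustrative, not from the original comment):

```latex
I = \frac{P}{V}, \qquad
P_{\text{loss}} = I^2 R = \frac{P^2 R}{V^2}
```

For a fixed delivered power $P$, the resistive loss falls with the *square* of the transmission voltage $V$, which is why grids step long-haul lines up to hundreds of kilovolts, while thicker wires (lower $R$) only reduce losses linearly. Published grid figures typically put transmission and distribution losses in the mid-single-digit percentage range, nowhere near "primary consumer" territory.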
Why just servers?
Last I checked, desktops were using about 6% of the US grid, with datacenters at 3%. Shouldn't we be fighting overly bloated desktop software too?
If Microsoft dropped Vista*, and went back to XP, we could easily cut the power consumption of desktops in half in the next few years.
*Whatever other faults and merits Vista has, it guzzles processor cycles, and by extension power, like mad. No normal user should need a high-wattage processor, but that seems to be exactly what Vista needs (as bad as 40% processor load at idle, on a 2 GHz dual core).
No surprise here
Someone is trying to capitalise on politically correct and popularly fashionable nonsense. This is not new. People have been doing it for ages: organic food, radium cocktails...
Seriously speaking - it is like trying to save a loss making business by cutting expenses on the flowers in reception while raising the CEO's salary.
The energy needs will grow and the carbon emissions will increase no matter what saving programs are attempted. This is the nature of human society - grow or die. Therefore, we will always save in one place at the cost of a much higher increase in another.
Loss-making businesses can only be saved by an increase in revenues - that means dynamic development, a larger market share, innovation. Cost savings only prolong the agony unless they are dramatic (and then it is easy to cut so much as to kill the business altogether).
In the case of carbon emissions, "revenue" is equivalent to *extraction* of carbon from the atmosphere, not mere reduction in emissions. The "dramatic" cost cutting is equivalent to switching to nuclear power generation.
So, unless someone talks in terms of nuclear power, fusion research and mass carbon sequestration, you can safely disregard what they are saying (at least in terms of saving the environment and averting/reversing global warming).
That includes anyone pushing for energy efficient housing, wind and tidal power, aviation taxes, biofuels, carbon footprints and so on and so forth.
Many years ago I recall writing a special-purpose timer program for a Psion Organiser. I soon found that it ran the battery down rather quickly, and so began measuring the battery current. By inserting various strategically placed "sleep" codes into the program I was able to reduce the power consumption to less than 30% of the unoptimised code.
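The Psion trick generalises: a timer loop that sleeps between checks lets the processor drop into a low-power state instead of spinning. A minimal sketch in Python (the original would have been Psion OPL or assembler; the function names and the 0.1-second wake interval here are illustrative choices, not from the original program):

```python
import time

def busy_wait_until(deadline):
    # Naive polling: the CPU spins at full power until the deadline,
    # burning cycles (and battery) doing nothing useful.
    while time.monotonic() < deadline:
        pass

def sleep_until(deadline):
    # Strategically placed sleeps: the CPU can idle between checks,
    # at the cost of coarser timing resolution.
    while True:
        remaining = deadline - time.monotonic()
        if remaining <= 0:
            break
        time.sleep(min(remaining, 0.1))  # wake at most ~10 times/second
```

The trade-off is resolution versus power: the longer the sleep quantum, the less often the CPU wakes, but the later the timer may fire past its deadline.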
I imagine most optimisations for speed-efficiency (including use of lookup tables for frequent calculations) will also prove power-efficient. Unfortunately, for common desktop apps, coding for efficiency went out of the window more than a decade ago.
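The lookup-table point is easy to illustrate: compute an expensive function once up front, then answer repeated queries with a cheap array index. A hedged sketch in Python (the table size and nearest-degree rounding are arbitrary choices for the example):

```python
import math

# Precompute sine for every whole degree once; repeated lookups then
# avoid re-running the trig routine, saving cycles and hence power.
TABLE_SIZE = 360
SIN_TABLE = [math.sin(math.radians(d)) for d in range(TABLE_SIZE)]

def fast_sin_degrees(deg):
    # Nearest-degree lookup: trades a little accuracy for speed.
    return SIN_TABLE[int(round(deg)) % TABLE_SIZE]
```

On a desktop the saving per call is tiny, but in a tight inner loop on battery-powered hardware, exactly this kind of precomputation is what keeps the processor out of its high-power states.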
Does anyone know whether US office buildings still try to out-macho each other with colder air-conditioning temperatures ?
A tolerable 18°C should be reasonable in summer ... or they should look at some passive cooling techniques that let hot air flow up out of a duct in the roof and draw in cooler ground-level air.
40% at idle? Are you crazy? Yes, Vista is heavy on RAM usage, but at idle it's 2-3% of CPU -- I get that number from my 1.8 GHz AMD Compaq and my 2.0 GHz Core 2 Duo machine.
Methinks someone needs to lose the malware or add-ons ...
Green computing? Ditch the VM
What's with the big fascination with virtual machine languages? Here you have scads of processor power, and yet it's wasted by dropping VMs on everything. Java, .NET, Perl, Python, Ruby, etc.: why? You want to do more with less? Use a real compiled language. C/C++ aren't the only things out there, you know!
Thanks for the laugh.
Yes, virtual machines (bytecode, not OS image, since you weren't clear and the uninitiated might be unfamiliar) are draining the world of all its power. :rolleyes: My electric bill shoots right up when I'm writing and running C# instead of C++. Yep. Dang.
Please, do tell, what serious compiled OO language exists out there other than C++ that doesn't live on a VM these days? One that's been proven scalable and commercially viable? One with any kind of support of the kind required to make for a serious development platform?
PS Java runs natively on Sun equipment. Huge financial and similar datacenters run Java natively to begin with. And they're the big fugly monsters under discussion here to begin with.