Standby consumes MORE POWER THAN CANADA: IEA

The International Energy Agency (IEA) is worried that the world's addiction to gadgets that sip electricity in standby mode uses more power than is necessary or sensible, and wants manufacturers to try harder to cut power consumption. The agency says inefficient “network standby” modes are common: consumers think a device has …

  1. Mark 85 Silver badge

    Ah... another good reason not to buy "enabled" appliances, in addition to not wanting my refrigerator to tell me I'm drinking too much beer and that the milk is going bad.

    1. big_D Silver badge

      I made the mistake of buying the original Apple TV.

      Apple's understanding of the device being "off" / in stand-by is to just turn off the video output. The processor still runs at full speed and the hard drive still spins at normal speed. The only way to really turn it off is to pull the power cord!

      1. Vic

        Apple's understanding of the device being "off" / in stand-by is to just turn off the video output. The processor still runs at full speed and the hard drive still spins at normal speed

        Sky decoders used to be the same - putting the device into "standby" meant turning off the A/V outputs and turning the front-panel LED red.

        These boxes need to be tuned to a stream, need to have their demux running, and need to have the CPU decoding some of the table information in the stream. This is how the Conditional Access stuff works, so I imagine it is still the case with current builds.

        Does anyone know what the energy impact of DAB would be if anyone used it?


        1. Steve Todd

          Does anyone know what the energy impact of DAB would be if anyone used it?

          Given that the latest DAB chipsets are down in the 2-3W range when they are working, not a lot.
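
A quick back-of-envelope check on "not a lot", taking the ~2.5 W figure from the comment above; the electricity tariff is an assumed, illustrative value:

```python
# Annual energy of a ~2.5 W DAB receiver left running 24/7.
POWER_W = 2.5          # from the comment above
PRICE_PER_KWH = 0.15   # assumed tariff, illustrative only

kwh_per_year = POWER_W * 24 * 365 / 1000
cost_per_year = kwh_per_year * PRICE_PER_KWH
print(f"{kwh_per_year:.1f} kWh/year, roughly {cost_per_year:.2f} a year")
```

About 22 kWh a year: noticeable, but small next to a set-top box idling at tens of watts.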

        2. garden-snail
          Thumb Up

          Sky boxes

          The more recent Sky+HD boxes now have an "eco" mode (on by default), which puts the box into a low-power mode overnight if it is not in use. The LNB power is cut, the disk spins down, the network disconnects - basically everything is off except an RTC that wakes it up in the morning, or for a scheduled recording or software/CA update.

          During the day, the disk powers off when not in use as well (the very early Sky+ boxes used to spin the disks 24/7).

      2. Michael Wojcik Silver badge

        Apple's understanding of the device being "off" / in stand-by is to just turn off the video output. The processor still runs at full speed and the hard drive still spins at normal speed. The only way to really turn it off is to pull the power cord!

        This is true of the DVRs I've had too - a very small sample, admittedly, but I suspect it's broadly true of set-top boxes and the like. Why would it be different? The simplest implementation is to have "standby" simply appear to the user to have disabled the device's user-visible functionality. Consuming power is an externality for the vendors of such devices.

  2. JeffyPoooh Silver badge

    Speaking of Canada...

    One version of this story made special mention of "tablets" as an example of the kind of gadget that burns too much power in standby mode. {Rolls eyes}... Yeah, standby mode, where the relatively tiny battery lasts for days.

    Anyway, most of the appliances I've bought recently are already off the scale (low end) in terms of their energy consumption, including standby. Their EnergyStar/EnerGuide labels show a range (e.g. 50" TV, $21-$67 per year), and my selected 50" TV is rated at $14 (setting a new record for next year's labels). Standby: "<0.5W". My fridge and freezer are similarly off the scale.

    It's a solved problem. Thus it's time to be extremely specific in the Name & Shame approach. Not just "game consoles", but specific models.

    1. veti Silver badge

      Re: Speaking of Canada...

      Right. Thinking of my Wii - the only games console I've ever had - I'm pretty sure the standby mode doesn't consume much power, because of one simple test: when it's switched off at the box (LED turns red, as opposed to the orangy-sort-of-colour it turns when you just use the remote), the box actually cools down.

      Same story for my home PC. When in Sleep mode, the fan isn't going, nearly all the LEDs are off, and there's no warmth coming from it. I'm not sure how much power it is consuming exactly - I know it's non-zero, because of what happens when there's a power cut - but I also know, from easily perceivable evidence, that it's only a tiny fraction of what it consumes when it's working.

      However, the 'Internet of Things' would require connectivity to be maintained 24/7. And that will waste a phenomenal amount of power, if we're dumb enough to fall for it.

      1. Don Jefe

        Re: Speaking of Canada...

        To be fair, heat isn't a very good indicator of draw. The primary heat source in a console or a general-use computer is graphics processing, and in standby mode graphics processing obviously isn't a factor. It's a matter of how the heat is created, not what the other, less operation-intensive but still (comparatively) significant-draw, lower-density component packages are doing.

        1. This post has been deleted by its author

          1. Paul Crawford Silver badge

            Re: Heat

            Yes & No.

            Of course heat is the end product of all losses, but if you have a ~100W device that at full power heats the touchable case to ~20C above ambient to dissipate said heat, you will hardly notice whether it is down to 5W, 1W or 0.2W on standby without a lot of careful measurement, as you would be looking at on the order of 1C or less.
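
Paul's point scales linearly; a minimal sketch, assuming steady-state case rise is proportional to dissipated power and taking the ~20C-at-100W figure from the comment above (it is the commenter's estimate, not a measurement):

```python
# Case temperature rise vs dissipated power, assuming linear scaling.
RISE_AT_100W_C = 20.0  # figure quoted in the comment above, not measured

def case_rise_c(watts: float) -> float:
    """Approximate steady-state rise above ambient for a given draw."""
    return RISE_AT_100W_C * watts / 100.0

for w in (100, 5, 1, 0.2):
    print(f"{w:>5} W -> ~{case_rise_c(w):.2f} C above ambient")
```

At standby levels the rise is fractions of a degree, which is why "the case feels cold" only tells you the draw is small, not how small.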

          2. Don Jefe

            Re: Speaking of Canada...


            No. That's the most basic understanding of electricity and, like most things, isn't as simple as you'd like it to be when you go to apply it.

            Heat as a function of draw is only, kind of, valid as a comparator for like vs like. You can make some crude assessments like that if you are comparing two AGPs or two LEDs. But that doesn't work if you are trying to compare the draw of a high-density component like an AGP to the system-wide draw of the rest of the components. It is entirely possible, and not uncommon, for the total draw of a system to greatly exceed the draw of the hottest component in the system but still produce less heat.

            1. Missing Semicolon Silver badge

              Re: Speaking of Canada...


              It does not matter if a device consists of many small heat emitters, or one big hot one. For a given power input, you get the same heat output.

              Now, the temperature of the one big hot thing may be higher, but the amount of heat will be the same.

              Basic thermodynamics.

              1. Don Jefe

                Re: Speaking of Canada...

                No. That's efficiency. Two systems with identical input and output can generate significantly different amounts of heat by using components of different efficiencies.

                Basic applied thermodynamics.

                1. veti Silver badge

                  Re: Speaking of Canada...

                  "Two systems with identical input and output" - really can't generate significantly different amounts of heat, if "heat" is part of the "output" you're looking at.

                  If you're talking about, say, light output, or computational power, then of course your statement becomes true. But irrelevant, because the more efficient system (assuming, of course, that the system as a whole isn't intended to produce heat as part of its primary functionality) is the one that produces less heat.

                  In equilibrium, energy out == energy in. That's basic thermodynamics. It doesn't matter if some components are producing more heat than others - if you're looking at your console (or whatever) as a black-box system, then heat is an excellent proxy measure for how much total power it's using.

                2. Missing Semicolon Silver badge

                  Re: Speaking of Canada...


                  PC 1 is a museum piece from the '90s, with a Pentium 100. It draws 100W at the plug.

                  PC 2 is a modern Core i5 from PC World. Draws 100W.

                  Plainly the new one gets much more done for the power used. But which one "makes more heat" i.e., which one warms the room most?

                  Answer. Neither.

                  Each is dissipating exactly 100W of energy in the form of heat. The energy consumed by the fans, moving air about, eventually becomes heat as the air slows down, for example.

                3. Metrognome

                  Re: Speaking of Canada...

                  That's a bit of a circular reference problem. If two bits of kit have identical input and output, then they ARE, by the very definition, producing the same heat (i.e. emitting energy as heat).

                  What effect that energy/heat produces in terms of temperature elevation depends on other factors like size, thermal conductivity, materials, surrounding environment etc.

        2. proud2bgrumpy

          Re: Speaking of Canada...

          Actually, heat is a direct indicator of energy usage: 1kW = 3,412 BTU/hr. It doesn't matter what the energy is being used for - electricity is electricity, whether it's spinning a disk, computing graphics or spinning a fan to push the heat somewhere else.

  3. Charles Manning

    It comes down to power supply efficiency

    It is pretty much impossible to design a switch mode power supply that is efficient at both low and high powers. Thus, most of the effort goes into designing a power supply that is relatively efficient when used at high power (ie. On).

    When in standby, either the same power supply is used and efficiency suffers badly, or you have a second power supply that is there to handle the standby power levels. That obviously drives up circuit complexity and costs a bit more.

    These days there seem to be directives to keep standby power low. Still, if you have a house full of 8W LED bulbs each chomping 0.5W 24/7 in standby mode, that adds up to quite a bit of leccy.
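
The "house full of standby loads" arithmetic sketches out roughly like this; the device count and tariff below are assumed, illustrative figures, not from the article:

```python
# Annual energy and cost of many tiny always-on standby loads.
STANDBY_W = 0.5        # per device, the figure in the comment above
DEVICE_COUNT = 30      # assumed: bulbs, wall warts, set-top kit, etc.
HOURS_PER_YEAR = 24 * 365
PRICE_PER_KWH = 0.15   # assumed tariff, illustrative only

kwh_per_year = STANDBY_W * DEVICE_COUNT * HOURS_PER_YEAR / 1000
cost_per_year = kwh_per_year * PRICE_PER_KWH
print(f"{kwh_per_year:.0f} kWh/year, about {cost_per_year:.2f} a year")
```

Roughly 130 kWh a year for loads you never consciously use - small per device, non-trivial per household, and large across a country.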

    1. DropBear Silver badge

      Re: It comes down to power supply efficiency

      ...except if you properly power down your electronics and consumption goes from amps to microamps or nanoamps, you won't care much at all that you supply that amount with a lousy 30% efficiency, innit...?

    2. This post has been deleted by its author

    3. Nigel 11

      Re: It comes down to power supply efficiency

      It is pretty much impossible to design a switch mode power supply that is efficient at both low and high powers.

      Not being an electronics engineer, I won't say "bollocks".

      But surely it's not beyond the wit of man to design a power supply that is integrated with a small rechargeable battery pack. The power supply would turn off, leaving the standby electronics running off the battery. They'd be capable of commanding the power supply back on to recharge the battery as fast as possible when it was close to empty, and then off again. In normal operation that wouldn't happen, because the device would get used before the battery was close to empty.

      You'd need a real switch to get it full-on if the battery had gone flat because of a long period without a mains connection. It might work better with an ultracapacitor instead of a battery (but NiMH cells are very reliable and could be user-replaceable).

      Methinks it's a cost issue, not a technological issue, and it needs legislation to outlaw devices with inefficient standby. Otherwise there's always an incentive to save pennies (straight to the bottom line!) by shipping inefficient devices.
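
Nigel's scheme can be sketched as a small control loop. Everything below (the class name, thresholds, charge and drain rates) is a hypothetical illustration of the idea, not any real product's firmware:

```python
# Battery-backed standby: the mains PSU stays off while a small battery
# runs the standby electronics, and is switched on only when the user
# wants the device or the battery needs a fast recharge.

RECHARGE_AT = 0.10  # assumed: turn the PSU on when the battery hits 10%
FULL_AT = 0.95      # assumed: turn it off again once nearly full

class StandbyController:
    def __init__(self):
        self.psu_on = False   # start in battery-only standby
        self.battery = 1.0    # fraction of full charge

    def tick(self, user_active: bool):
        if user_active or self.battery <= RECHARGE_AT:
            self.psu_on = True            # serve the user / recharge fast
        elif self.battery >= FULL_AT:
            self.psu_on = False           # back to battery-only standby
        # crude model: fast charge on mains, slow drain on standby
        delta = 0.2 if self.psu_on else -0.01
        self.battery = min(max(self.battery + delta, 0.0), 1.0)
```

Over a long idle stretch the PSU ends up energised for only a small fraction of the ticks; the rest of the time the mains-side standby draw is zero, as Nigel suggests.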

      1. Jamie Jones Silver badge
        Thumb Up

        Re: It comes down to power supply efficiency

        Nigel 11, it doesn't matter that you aren't an electronic engineer in this case - your common sense is sufficient (I did electronic engineering at university and can assure you many of my fellow students wouldn't have had this idea).

        But yeah, I've advocated EXACTLY what you describe (maybe we should go into business together!): use a battery (or maybe a capacitor, where appropriate) to run standby mode, ensuring the power supply is off entirely, and only power it up if the unit is switched on or the battery needs charging, with a hardwired override button for those times the battery is dead - just as you describe!

        Seems obvious to me!

    4. JeffyPoooh Silver badge

      Re: It comes down to power supply efficiency

      Chas: "...a house full of 8W LED bulbs each chomping 0.5W 24/7 in standby mode..."

      Huh? 8 watt LED light bulbs are *universally* installed in fixtures with a hard power switch. When you turn off a light bulb, LED or otherwise, it's flippin' OFF.

      There's NO SUCH THING as standby mode for a light bulb. At least not at this juncture in history.

      1. Don Jefe

        Re: It comes down to power supply efficiency


        Yes. Lots of lightbulbs do indeed have a standby mode, but I suspect not many people have them in their homes.

        Modern 'Explosion Proof' lighting assemblies use standby to prevent sudden state change from potentially causing a spark (and explosion if it's a bad day). A convenient side effect is that the bulbs last 'forever' as they never experience the cold to wide open state change that kills other bulbs.

        High intensity HID spotlights, emergency lights, construction lights, makeup lights and shitloads of specialty videography and photography lights all have standby for the same reason as above and so that they provide their full power and proper color of light immediately after being turned on. Old, pre-standby lights of those sorts experienced significant color shifts as they 'warmed up'. The end result was that it was hotter than hell where those lights were in use because they were left on to eliminate the color shift period.

        Most streetlights in the US have standby, mostly for filament longevity purposes. There was a big grassroots movement here in the late '90s when the proles discovered that their streetlights were sucking so much power, and older lights were actually on constantly, at reduced levels, by virtue of current leaks - but nobody knew it, because you can't see them burning in daylight.

        I'm sure some home lights have standby as well, but most people have no need for such things and would never even go looking for them.

        *New lights in reef aquarium setups have standby too. People probably do have that sort of thing in measurable quantities.

    5. Missing Semicolon Silver badge

      Panasonic LCD telly..

      ... seems to have a low power mode. When you press "standby" it goes dark - about 5 minutes later, you hear a "tick" of a relay changing. If you subsequently switch it on again, starting up takes as long as it does from cold.

      I'm guessing that it has two power supplies, a big one to run it normally, and a weeny one to power the remote control receiver.

  4. Anonymous Coward
    Anonymous Coward

    It's really about standby energy, not standby power

    My Wii consumes only about 1/10 of the power in standby that it does fully operating (ballpark). However, it's in standby for more than 95% of the time, so standby accounts for about 66% of its lifetime energy consumption. That's a very large slice of the pie for next to nothing in return for me, the end user.
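
That split follows directly from the ratios in the comment; a minimal check, using its own ballpark figures:

```python
# Share of lifetime energy that goes to standby, given the comment's
# ballpark figures: standby draws ~1/10 of operating power, and the
# console sits in standby ~95% of the time.
P_ON = 10.0            # arbitrary units; only the 10:1 ratio matters
P_STANDBY = 1.0
FRACTION_STANDBY = 0.95

e_standby = P_STANDBY * FRACTION_STANDBY
e_on = P_ON * (1 - FRACTION_STANDBY)
standby_share = e_standby / (e_standby + e_on)
print(f"standby's share of lifetime energy: {standby_share:.0%}")
```

The share comes out just under two-thirds, matching the "66% of lifetime energy" figure.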

  5. StimuliC

    Well it's not hard

    It's quite an irrelevant comparison. They take a big country with not that many people relative to its size and announce that standby uses more power than Canada. Standby use in California alone probably accounts for as much. Heck, I probably use more than Canada in that scenario.

    1. Don Jefe

      Re: Well it's not hard

      Please reorient your map, or globe, so that Canada is near the top (or bottom) relative to the floor in your house.

      In this orientation it should be much easier to see that Canada is not generally considered to have a traditional, equatorial climate. In fact, recent satellite imagery has confirmed that Canada is about as far the fuck away from the equator as it can be. There is a growing consensus among climate scientists that Canada might be really god damn cold for 80% of the year. The few Canadians that have been studied are purported to live in small, dome-shaped shelters made of stacked snow bricks, called 'igloos', and to dress themselves in baby seal skins for protection from the cold. Little else is known about these hardy people.

      Pending future expeditions across the tundra and into the mythical realm where legends of monstrous white bears and rabid white foxes frolic, all tales of large, cosmopolitan cities with excellent education and healthcare systems, good beer and stories of a tribe of lost Frenchmen are to be considered myth.

      1. Boris the Cockroach Silver badge

        Re: Well it's not hard

        You forgot the 2 months of 'summer' when mosquitos the size of large birds infest the air sucking the life out of any fool brave enough to dare to walk outside....

        Well it was featured on Sci-fi channel so it must be true

      2. Androgynous Cupboard Silver badge

        Re: Well it's not hard

        What's that? Consensus amongst climate scientists you say? Nonsense! I think you'll find Canada is smack bang on the equator and that any coldness is due to natural variations in fridge door opening.

  6. Metrognome

    Pointless and confusing

    I'm with Cambridge professor MacKay on this one.

    He made a very serious attempt at quantifying what it will take to be sustainable.

    My best quote: "if everyone does a little, then all we can achieve is a little" (such a no-brainer mathematically that it makes you wonder why people didn't see through the 'every little helps' slogan).

    At the end of the day if we want to really save energy our best bets are insulating our homes and assisting our boilers with solar panels or underground heat exchangers.

    1. Don Jefe

      Re: Pointless and confusing

      Yeah. People really suck at math. Some of the worst at it I've ever seen are the CFOs of energy providers. They get all bent out of shape when residential consumption in an area goes down 0.05%. If those idiots only knew little things didn't have a significant impact, their jobs would be a lot easier. I'm sure it's probably a job security thing. If they didn't make a big deal out of the little stuff they would probably be made redundant.

  7. Ian Emery Silver badge

    The IEA have no credibility due to their repeated over-inflation of power consumption costs by at least a factor of ten in most, if not all cases.

    My guess is they pick the most power hungry (ancient) model in each category and then "cost" it using the price of the most expensive electricity on the planet.

  8. Anonymous Coward
    Thumb Down


    Given that security isn't anywhere to be seen (plain HTTP, not HTTPS), what makes you think power efficiency is a consideration?

  9. James Boag

    look on the bright side

    I've just discovered I can run my Raspberry Pi from a PC power supply on standby, and fire it up fully when I want to use more power.

  10. Anonymous Coward
    Anonymous Coward

    Need to be on?

    I want presence or proximity sensing technology in the house. Not being a big oven user, why should the oven be on when I am not there? (I never leave it on intentionally while absent.)

    I want to be able to categorise appliances (a few very quick category ideas):

    1. Stuff that carries some risk and does not ever need to be on when the house is empty (kettle, iron, power hand tools, shower).

    2. Things that may be left on but need to be acknowledged on exit (slow cooker, washing machine, dehumidifier).

    3. Junk: stuff that has really low risk but should be off unless its power schedule permits updates (games console, recorders, PC).

    There could be another couple of categories, like remote "graceful shutdown" and "power cut", so that if the house is remotely monitored and a severe electrical storm approaches or a fire is identified, the big red button can be pressed.

    That said, my biggest saving on electricity came from the power company replacing a faulty meter. I only really spotted the issue because I noticed the display fading while taking readings, which also allowed me to see the change when the meter was replaced.

  11. Anonymous Coward
    Anonymous Coward

    Remembering settings

    How do devices remember their user settings when powered off at the wall? Is it a large capacitor or a non-volatile memory device?

    My TEAC radio/hi-fi amp loses its station presets if powered off for more than a couple of hours. My bedside TEAC mini media centre forgets the volume/bass/treble settings if the power is even blipped. The station presets are remembered - but I'm not sure for how long during a proper outage. Other household radios seem to lose their settings after about a day of no power.

    My PC monitors remember all their settings for days, and probably longer, without power. The most efficient devices appear to be my large LCD radio-synchronised clocks, which use one "AA" battery every two years. The central heating timer usually forgets its settings if there is a power failure; the "PP9" back-up battery is invariably dead on these rare occasions, and is a pig to replace.

    A few years ago the office decided to save power by turning electronic devices off at the wall at the end of the day - rather than let them sit there in stand-by. Within a month about a third of the devices needed repairing.

    1. Steve Todd

      Re: Remembering settings

      There are these devices called EEPROMs that can store a few hundred K of data without power. A number of microcontrollers have them built in. That, or there's something called flash memory, maybe you've heard of it? PCs tend to have a battery-backed clock (powered by a small lithium cell) and EEPROM combined, so they can stand for years without power and still start correctly.

      The reason that devices like to be left running is something called heat cycling. Components expand as they warm up, and contract when they cool down. This results in failures due to mechanical stresses.

  12. Jon Kinsey

    The answer is obvious

    Turn off the power in Canada.

  13. Andrew Jones 2

    I wish the people who decided to declare standby bad for the environment actually understood why standby exists in the first place. Let's all turn our devices OFF when we finish using them, and then in six months' time there will be new reports from confused politicians who can't understand why the number of electrical devices entering landfill has tripled.....

    Meanwhile - feel free to replace your car with an electric one, because from all the pushing from politicians and "energy experts" you'd be forgiven for believing that the electricity they use just appears out of nowhere and doesn't have to be generated first......

    1. Steve Todd

      "Meanwhile - feel free to replace your car with an electric one, because from all the pushing from politicians and "energy experts" you'd be forgiven for believing that the electricity they use just appears out of nowhere and doesn't have to be generated first......"

      Power generated by renewables is intermittent in nature and hard to store. That's one source. The second (existing and well known) source is the large gap between capacity and demand over night. They already sell off-peak power cheaply in an attempt to get people to use it. You don't need any more generating capacity in order to cope with (IIRC) 30% of the country's traffic being electric powered.

    2. Don Jefe

      @Andrew Jones

      I'm not sure about your second point. I agree that biased advocates (of anything) and politicians can't be relied upon to know what they're talking about. But it's a pretty broad stroke to include target customers for the current crop of electric cars in the 'somebody recognizable said (x) so it has to be true' category. Yes, there are idiots there too, no doubt, but not nearly as many as in the general population. The non-idiots in that target demographic tend to be the same educated professionals that are nearly impossible to market to because they are fairly independent and tend to reach their own conclusions in their own way. I'm not trying to imply they are smarter or anything, just that 3rd party influence is largely wasted on them because of their decision making process.

      But you're spot on with the first point. Many moons ago I was sent to sort out a printer manufacturer that had been acquired by my employer (this was back when consumer printers still cost more than their ink refills). Their warranty losses on consumer products alone were more than enough to get the books right side up so that's what we focused on. The consumer models were basically the commercial models with smaller paper capacities and such but the commercial models rarely failed. We finally found the problem and it lay in the fact that more than 2/3 of the consumer users printed for less than a single hour per year (about 30 pages) and the rest of the time the printer was turned off. The commercial units were almost never turned off, thus no chain of similar failures for comparison. For a bit less than $1.75 per unit a retrofit standby solution was developed and within a year the company was in good enough shape to be sold to someone else.

      There's no doubt that standby is important and people would be very upset if standby wasn't so common.

    3. Jamie Jones Silver badge

      I agree about power-cycling, but then it depends on how deep 'standby' mode is.

      Many white goods power just about everything off but the wake-up circuit, meaning that big old transformer keeps humming away, providing no more benefit than on/off cycling would (except, of course, for the transformer unit itself!)

  14. Bob 5

    In the 70s I had a 25" colour TV that consumed 250W. Now I have a 55" colour TV that consumes 43W. Similarly, standby power consumption on devices like TVs has been reduced from tens of watts a decade or two ago to normally less than a watt now (last time I checked, European directives specified 0.5W/1W max standby power, depending on application). So energy waste is being tackled, and there is no doubt standby power can be further reduced to microwatt levels. TV/radio etc. settings can be stored in NVRAM, leaving only a micropower infra-red receiver functioning, powered by a micropower capacitively coupled low-loss PSU awaiting the wake-up command.

    In a switch-mode PSU, burst mode can only reduce consumption at low output levels by reducing switching losses - the efficiency of any PSU at zero output is still 0% - the aim, as always, is to reduce the standby consumption.

    Unreliability caused by frequent heat cycling of intermittently powered devices is something that has to be eliminated by design - the aim being to make it unnecessary to keep a device permanently powered to prolong its reliability; that is sooo wasteful. In my experience such unreliability isn't a great problem anyway - my TVs, computers, radios etc. are all turned off when not in use and I have not experienced any untoward failures. Remember that electronic devices contain items with a limited life, in particular standard electrolytic capacitors (e.g. motherboard capacitors.....), so any unreliability caused by heat cycling could be more than offset by the life gained from switching off when not in use. Check the manufacturers' rated life for electrolytic capacitors - you may be surprised.
